AI Voice Technology Just Got INSANE (ElevenLabs GenFM Demo + More)
AI transcript
0:00:07 >> Will come on some Next Wave podcast with Matt Wolf and Nathan Lanz. 0:00:10 >> It feels like probably the biggest unlock. 0:00:14 You could produce content in one language and then make it available in 33 different languages. 0:00:18 >> Yes, to be able to just talk to AI now and to build so many different apps. 0:00:23 It’s like, this is a time I keep telling people where you can be the idea person who ships. 0:00:26 >> Hey, welcome to the Next Wave podcast. 0:00:29 I’m Matt Wolf. I’m here with Nathan Lanz. 0:00:32 And in this AI landscape, this AI world that we’re in, 0:00:35 there are so many different tools out there. 0:00:37 And most people are sitting around going, 0:00:39 “I don’t know how to use this in my business. 0:00:41 I don’t know what this is good for.” 0:00:42 Well, today, we’re going to talk about a tool 0:00:45 that people are actually implementing in their business 0:00:47 and people are actually generating revenue from 0:00:52 and using really successfully to generate side hustle income. 0:00:56 Today, we’re talking to Amar Reshi from Eleven Labs. 0:00:57 And if you’re not familiar with Eleven Labs, 0:01:00 it is a tool that allows you to train your own voice into it. 0:01:03 It creates podcasts. It creates sound effects. 0:01:07 It does all sorts of amazing things with audio. 0:01:11 And Amar is going to break down all of that for us on this show 0:01:14 and even show us inside the app how to use it, 0:01:16 how to get the most out of it. 0:01:18 And it’s just got a ton of cool features. 0:01:21 >> The big unlock here is the fact that with Eleven Labs, 0:01:23 you could produce content in one language 0:01:26 and then have it go out to 33 different languages. 0:01:28 So in business, that is such a huge unlock 0:01:30 that you can now reach all these new markets 0:01:31 that you couldn’t before. 0:01:34 >> Yeah, this tool is absolutely game changing. 0:01:35 We use it ourselves. 0:01:37 It is a super fun tool, too. 0:01:38 You’re going to love it. 0:01:41 And so let’s go ahead and just break it all down with Amar Reshi. 0:01:46 >> Look, if you’re curious about custom GPTs 0:01:49 or our pro that’s looking to up your game, listen up. 0:01:53 I’ve actually built custom GPTs that help me do the research 0:01:56 and planning for my YouTube videos 0:01:58 and building my own custom GPTs 0:02:00 has truly given me a huge advantage. 0:02:02 Well, I want to do the same for you. 0:02:04 HubSpot has just dropped a full guide 0:02:06 on how you can create your own custom GPT 0:02:08 and they’ve taken the guesswork out of it. 0:02:11 We’ve included templates and a step-by-step guide 0:02:13 to design and implement custom models 0:02:16 so you can focus on the best part, actually building it. 0:02:19 If you want it, you can get it at the link in the description below. 0:02:20 Now, back to the show. 0:02:25 [MUSIC] 0:02:28 >> We’re here with Amar Reshi from 11 Labs 0:02:31 and we’re going to talk about some of the really cool stuff 0:02:34 that 11 Labs has been rolling out recently. 0:02:36 So Amar, let’s jump in real quick though 0:02:38 and give a little bit of background on you. 0:02:40 What were you doing before 11 Labs 0:02:41 and how did you get involved with them? 0:02:42 What’s your role there? 0:02:44 Kind of give us the lay of the land a little bit. 0:02:46 >> Yeah, yeah, sounds good. 0:02:48 I caught the AI bug like everyone else 0:02:52 and around late 2022, I fell into it all. 
0:02:55 Really on the deep end when I published a children’s book 0:02:57 that accidentally went viral. 0:02:59 You remember that map because that’s how we met. 0:03:00 >> Yeah, yeah. 0:03:02 >> And so after that episode, 0:03:06 I basically started to just share stuff with AI every month or so. 0:03:07 It was a new experiment, 0:03:09 new tools are coming out all the time, right? 0:03:12 We just got Runway, we just got Pika and all these things. 0:03:16 And so it was a new thing every month or so, 0:03:17 sharing each creation. 0:03:20 And 11 Labs was a tool I stumbled across. 0:03:22 I started using it for a bunch of my experiments, 0:03:23 voice-overs and videos, 0:03:28 videos, characters voiced in those AI-generated videos. 0:03:31 And yeah, I really, really enjoyed using the tool, 0:03:33 but I was burning credits and I was- 0:03:37 So you’re like, hire me, hire me, give me free credits. 0:03:41 Well, that’s hilarious. 0:03:43 It’s like the college student working at like Burger King 0:03:45 or something like getting free burgers, right? 0:03:46 It’s like- 0:03:49 But no, I threw that, I essentially was like, 0:03:51 how do I get in touch with 11 Labs? 0:03:53 I wonder if they’d enjoy using this product. 0:03:56 I spoke to the founder, we had a great conversation, 0:03:59 and he was also looking to hire someone to lead the design team. 0:04:02 And that’s essentially how it started. 0:04:05 We hit it off and yeah, the rest is history. 0:04:09 And I’m head of design at 11 Labs and having a blast. 0:04:09 >> Very cool. 0:04:13 Yeah, I remember we were doing some Twitter spaces for a while. 0:04:14 We were on a pretty consistent streak 0:04:16 of doing like these weekly Twitter spaces. 0:04:19 And I think that’s how we initially connected. 0:04:21 And you had the book that you put out. 0:04:24 And I think that book was actually one of the first times 0:04:25 that I started to realize, 0:04:28 oh, AI is actually kind of a controversial space. 0:04:30 Like up until that point, I was just like, 0:04:32 look at these cool tools, these are so much fun. 0:04:36 But I started to see some of the backlash bubble up 0:04:39 around that time, around AI. 0:04:42 And so I do remember that very well. 0:04:45 I remember you had kind of like a mixed experience 0:04:46 putting out that book, let’s say. 0:04:50 >> Yeah, mixed is probably the right word. 0:04:53 I had a bit of an Oppenheimer moment looking in the lake. 0:04:55 Like what have I done to art? 0:04:59 But no, it was a great learning experience 0:05:02 because I think it showed me someone in the tech bubble 0:05:03 in San Francisco. 0:05:04 And also there’s someone who always just sees 0:05:06 the optimistic side of all of this. 0:05:08 Hey, actually there’s a whole set of people 0:05:10 who have a very visceral reaction to this. 0:05:12 And it was worth seeing their side of the story, 0:05:15 even if the way they reacted was maybe a little harsh. 0:05:18 >> Yeah, I think that’s the best perspective 0:05:21 in the world of AI is to try to see both sides of the token. 0:05:24 Let’s talk about 11 Labs a bit. 0:05:26 11 Labs, I think, first came onto the scene 0:05:28 is like a voice cloning tool. 0:05:30 It was the first one I ever remember coming. 0:05:32 There was voice cloning tools out there, 0:05:35 but they were very obviously AI, right? 0:05:38 Like they were like that Siri Alexa kind of voice 0:05:40 where you can tell it was AI. 
0:05:42 11 Labs was the first one I remember popping up 0:05:45 where I was like, I could barely tell the difference 0:05:47 if this is a human or not anymore. 0:05:50 And then I was able to actually clone my own voice into it 0:05:53 and make like little recordings with my own voice. 0:05:57 But I mean, 11 Labs has come so far since then. 0:06:01 Like, I guess what is 11 Labs like big picture mission? 0:06:03 Like who are they trying to help? 0:06:05 And like who are these tools for? 0:06:07 Because it seems like there’s a pretty wide range 0:06:09 of abilities it has now. 0:06:10 Yeah, yeah, yeah, yeah. 0:06:12 So a funny story is the way it started 0:06:16 was the two co-founders in Poland were watching a show 0:06:17 on one of the streaming sites. 0:06:18 It was all dubbed in Polish. 0:06:22 But in Poland, that show was dubbed by one voice actor 0:06:24 who did the men and the women. 0:06:27 So you can imagine what an experience that was like. 0:06:29 And I kind of took them down the rabbit hole. 0:06:31 There’s got to be a better way, right? 0:06:34 And so to start with the problem we all had, 0:06:36 whenever you watch a dub show and it sucks 0:06:38 and you’re like, oh, I wish there was a better way. 0:06:40 I don’t want to read the subtitles. 0:06:42 And so that kind of took them down the dubbing rabbit hole. 0:06:45 But the mission there and the mission, I think, 0:06:48 still holds true, which is we say internally, 0:06:50 it’s like keep content universally accessible, 0:06:51 any language, any voice. 0:06:54 And that’s where a lot of the dubbing stuff started. 0:06:57 But then also the huge voice library now 0:07:00 with 32 plus different languages, the model support. 0:07:01 And so it still is that. 0:07:03 It’s we really want to make content engaging 0:07:05 any language, any voice. 0:07:07 You should have the same great authentic experience 0:07:08 that you had in your language. 0:07:10 Another language for other people. 0:07:12 But what’s been amazing is that platform that started 0:07:16 as this voice replication and dubbing thing 0:07:20 is now just a complete AI audio platform in suite 0:07:22 where you just generate sounds, soon music. 0:07:25 So there’s going to be all sorts of fun stuff coming there. 0:07:28 When it comes to celebrity voices and stuff like that, 0:07:33 what’s 11 labs thoughts or approach on cloning voices 0:07:34 without permission? 0:07:37 Because I do know that’s a worry of a lot of voice actors, 0:07:39 especially the stance internally. 0:07:41 And at least the things that I can speak to are, 0:07:44 you know, we take the deep fake stuff like very seriously 0:07:48 and and the safety bits are actually a feature built in 0:07:50 that we think through from the beginning. 0:07:52 So very much like, how do we recognize voice signatures 0:07:54 for these famous voices? 0:07:57 How are we seeing what’s generated moderating that? 0:07:58 So that stuff is like all monitored. 0:08:02 And I think the team has kind of been working on that nonstop 0:08:05 since the beginning, which is why even this election cycle 0:08:07 is very smooth and everything all went well. 0:08:10 And I think, yeah, but on the celebrity voices thing, 0:08:11 we do want to work with them, right? 0:08:14 I think it’s like we and we already started 0:08:16 having a bunch on the on the platform. 0:08:18 So on the mobile app, we have Bert Reynolds 0:08:22 and Judy Garland and James Dean and Jerry Garcia 0:08:24 and Deepak Chopra actually just put his voice on too. 
0:08:27 And you can like, you can listen to his meditations 0:08:27 in his voice. 0:08:31 So it’s been really fun to kind of start getting 0:08:33 more and more folks on the platform that way. 0:08:36 Well, I would love to sort of dive in and like, 0:08:39 maybe even do some screen sharing and maybe talk about 0:08:43 some of the features that you are, you think are the coolest. 0:08:46 Maybe anybody watching this on YouTube can get a peek. 0:08:49 If they haven’t tried 11 Labs, they can see sort of some 0:08:51 of the stuff that it’s capable of. 0:08:54 And then maybe even take a look at the iPhone app 0:08:56 if we can figure out how to do it technically. 0:08:59 Take a look at the iPhone app because that 11 Labs reader 0:09:00 is really, really cool. 0:09:02 Like you can go and take any article you find online, 0:09:06 any PDF, any, pretty much any source of text, 0:09:09 throw it in there and have somebody read it to you. 0:09:11 And now you can even have it turned into a podcast for you. 0:09:13 So like, that’d be really cool to talk about. 0:09:16 But maybe let’s start with like the desktop app. 0:09:17 Let’s do it. 0:09:18 Let’s do it. 0:09:18 Cool. 0:09:20 Well, okay, we’re in the dashboard here. 0:09:23 And this is essentially where you can start creating 0:09:27 all sorts of stuff with our generative AI audio. 0:09:28 So let’s start with text-to-speech. 0:09:31 This is essentially what most people end up using 0:09:34 because it’s widely applicable for voiceovers in video games, 0:09:37 to YouTube videos, to any sort of way 0:09:40 you might want to use generate speech. 0:09:41 So here I’ve got maybe, you know, 0:09:45 a nice fun opener for a potential podcast. 0:09:47 And let’s just regenerate and hear it. 0:09:48 Hey, everyone. 0:09:50 Welcome back for another deep dive. 0:09:51 Cool. 0:09:53 So you got a sense of Brittany there 0:09:55 who’s in our voice library. 0:09:58 And you can see we’ve got tons of voices as well 0:10:00 for so many different use cases. 0:10:03 So that’s Brittany, our kind of social media style voice. 0:10:04 Hey, everyone. 0:10:06 Welcome back to the channel. 0:10:08 And then you have, you know, trailer voice. 0:10:10 Remember those trailers from the 90s? 0:10:14 In a world where AI voices sound like robot. 0:10:17 Can we do like one for the next wave like that? 0:10:21 Welcome to the Next Wave podcast. 0:10:23 And these voices that you’re sharing, 0:10:24 these are voices that are available 0:10:26 to anybody with an 11 lab subscription, 0:10:29 or these like ones that you’ve personally trained 0:10:30 that are just in your account. 0:10:32 Yeah. 0:10:34 These are all available in the library. 0:10:36 And so the ones you’re seeing are ones 0:10:37 that have just added from the library 0:10:38 available to everyone. 0:10:39 Gotcha. 0:10:39 Cool. 0:10:41 Let’s hear the trailer voice. 0:10:43 Welcome to the Next Wave podcast 0:10:46 with Matt Wolfe and Nathan Lans. 0:10:49 Okay, that’s the new intro. 0:10:50 Yeah, yeah. 0:10:52 I don’t have to record one anymore. 0:10:52 We’ll just use that one. 0:10:54 Yeah, download this and send you the file. 0:10:54 Okay. 0:10:57 Amazing. 0:10:58 Yeah. 0:10:59 And so you’ve got the voices, 0:11:01 we’ve got the different models, 0:11:02 our turbo model, which is cheaper, 0:11:04 but way faster. 0:11:05 And so that’s really great for people 0:11:06 who are building apps 0:11:09 and need really fast reaction times 0:11:11 for the generations. 
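For anyone who would rather script this than click around the dashboard, the same text-to-speech flow is exposed through ElevenLabs' public REST API. The sketch below is illustrative rather than official guidance: the API key and voice ID are placeholders, the model ID is one of the publicly documented options, and the exact parameters should be checked against the current API docs before relying on them.

```python
# Minimal sketch of the text-to-speech call described above, using the public
# ElevenLabs REST API. All credentials and IDs below are placeholders.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"   # from your account settings (placeholder)
VOICE_ID = "YOUR_VOICE_ID"            # copied from the voice library (placeholder)

payload = {
    "text": "Hey, everyone. Welcome back for another deep dive.",
    # Mirrors the model dropdown in the dashboard; "eleven_turbo_v2" is the
    # faster, cheaper option mentioned above. Model IDs may change over time.
    "model_id": "eleven_turbo_v2",
}

response = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()

# The endpoint returns audio bytes (MP3 by default); write them to a file.
with open("podcast_opener.mp3", "wb") as f:
    f.write(response.content)
```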
0:11:14 And then yeah, some legacy models, 0:11:16 if people still prefer how some of the older ones sound, 0:11:17 but we kind of kept that in there 0:11:18 for our power users. 0:11:19 But yeah. 0:11:21 And then real quick, what are the little like, 0:11:22 you’re probably about to get into this 0:11:23 and I’m just sort of getting ahead of myself, 0:11:25 but what are the little like sliders 0:11:27 and can you sort of explain what those do 0:11:29 and how they affect the output? 0:11:30 Yeah, so a few of them. 0:11:32 So stability is essentially, 0:11:34 you can push it to be more variable 0:11:35 and you’ll get more intonation 0:11:36 and stuff like that. 0:11:39 But it’s a little, it gets a little unstable. 0:11:41 So it’s really, if you just want to experiment 0:11:43 with the way the output might play out, 0:11:45 similarity is really useful 0:11:47 if you have a replica of your voice 0:11:48 and you’re trying to decide 0:11:50 how similar you want it to sound 0:11:52 versus maybe lose it a little bit 0:11:53 and try something else. 0:11:55 And then style exaggeration. 0:11:57 Yeah, this one is, 0:11:59 I view it as like a very experimental slider. 0:12:02 It’s like, you’re not sure what you’re going to get, 0:12:04 but you might get a range of emotions. 0:12:07 So it’s feeling a little boring. 0:12:08 Try the style of exaggeration. 0:12:09 Yeah. 0:12:10 And one thing that’s really cool too 0:12:13 is that it’s like, you generate a sentence 0:12:14 and you hear it, you know, like, 0:12:17 I don’t really like the way that came out. 0:12:18 You tweak a slider. 0:12:20 I mean, it seems like you don’t even have to tweak a slider. 0:12:21 You can just generate it again. 0:12:22 It’ll sound a little bit different, 0:12:24 but tweaking the sliders will sort of 0:12:28 make a bigger impact on the regeneration of it. 0:12:29 Oh, 100%. 0:12:31 And the other thing that actually makes a difference 0:12:33 is how you’ve written the text out. 0:12:35 So for instance, if I had done all caps, 0:12:37 it’s actually going to be a bit louder 0:12:38 and more exaggerated. 0:12:41 The exclamation mark is adding more emphasis. 0:12:42 So it actually understands the context 0:12:44 of the text that it’s reading. 0:12:46 Yeah, there’s some handy tips. 0:12:48 Like, I’ve been playing with 11 Labs for, 0:12:50 I don’t know how long 11 Labs has been around, 0:12:52 but I feel like I’ve been playing with it 0:12:54 for at least 18, 19 months now. 0:12:57 I didn’t even know some of that kind of stuff. 0:12:59 So that’s really cool. 0:13:00 Amazing. 0:13:02 My favorite, though, is the voice changer. 0:13:04 And I’m not sure how I’m going to demo it here. 0:13:07 It’s a tricky one, but you can take your own voice 0:13:08 and transform it into another. 0:13:12 So, you know, maybe you want to say that trailer line 0:13:12 in a different way. 0:13:17 We can take that audio and then turn you into David, 0:13:20 the trailer voice, with exactly how you set it. 0:13:23 Yeah. So all the, like, specific inflections 0:13:26 that you might put into the sentence and things like that, 0:13:28 it’s going to follow that exactly, pretty much. 0:13:29 It’ll follow that exactly. 0:13:32 And so this is, honestly, if you really want to direct the voice, 0:13:33 this is the best way to do it. 0:13:36 You have to, you know, be a little bit of a voice actor yourself, 0:13:39 but you’ll get some fun output with this one. 0:13:39 Yeah, yeah. 
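The sliders Ammaar walks through here correspond to a voice_settings object on the same API call. Below is a minimal sketch under the same assumptions as the previous snippet (placeholder key and voice ID, field names taken from the public docs); the numeric values are only starting points to experiment with, exactly like the sliders themselves.

```python
# Sketch: passing the dashboard sliders (stability, similarity, style) as
# voice_settings on a text-to-speech request. Values are illustrative only.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"   # placeholder
VOICE_ID = "YOUR_VOICE_ID"            # placeholder

payload = {
    # Caps and exclamation marks nudge the delivery, as discussed above.
    "text": "WELCOME to the Next Wave podcast!",
    "model_id": "eleven_turbo_v2",
    "voice_settings": {
        "stability": 0.4,          # lower means more variable intonation
        "similarity_boost": 0.8,   # how closely to match the chosen or cloned voice
        "style": 0.3,              # "style exaggeration": experimental emotion range
        "use_speaker_boost": True,
    },
}

audio = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY},
    json=payload,
)
audio.raise_for_status()
with open("emphasized_intro.mp3", "wb") as f:
    f.write(audio.content)
```

Regenerating with identical settings still varies a little from take to take, so adjusting these values usually moves the output more than repeated regenerations do.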
0:13:43 I wonder if it would work, like, if it’ll take the audio 0:13:46 from the microphone, even though you’re on a podcast. 0:13:47 You really want me to voice that right now? 0:13:52 What about, like, a three-word thing? 0:13:55 Like, let’s freaking go or something like that. 0:13:57 All right, well, I’m going to try the mic or got it over here. 0:13:59 Okay. Okay. 0:14:01 Hopefully it doesn’t take over my mic over here, 0:14:02 but we’ll see. 0:14:04 Three, two, one, let’s go. 0:14:08 Three, two, one, let’s go. 0:14:12 Three, two, one, let’s go. 0:14:14 So perfect. 0:14:14 That’s so cool. 0:14:17 I love that. 0:14:21 So, like, if you’re not quite getting the exact output you want 0:14:22 by typing in the text, 0:14:25 this is sort of like the next thing you can go try, 0:14:29 just speak it out in the exact way you want it to be heard, 0:14:31 and then it’ll generate it that way. 0:14:34 And so, yeah, you saw, you know, the voices here, 0:14:38 but I do want to show you the library where all of the voices are. 0:14:40 And so this is where, you know, I think you can have the most fun, 0:14:43 just browsing what all the different voices on the platform are. 0:14:46 And we have voice actors as well who’ve now added their voice. 0:14:53 So one of my favorites is Carter who actually did a lot of the voices 0:14:55 for Mortal Kombat and Street Fighter. 0:14:58 So Baraka and like all these characters that you remember, yeah. 0:15:02 I’m an 80s kid, so yeah, definitely. 0:15:04 Yeah, I love Mortal Kombat. 0:15:07 So for me to see his voice on the platform is super cool. 0:15:08 You can hear it. 0:15:11 I’ve failed over and over and over. 0:15:12 So his voice Shao Kahn, 0:15:15 and this is so much like that Shao Kahn voice, right? 0:15:17 Oh my God, I got to use that for a video game. 0:15:21 That’s got to be like the voice in the game, like the narrator. 0:15:23 Yeah, 100%. 0:15:25 And we even have a singing voice. 0:15:30 So and this is kind of the cool thing about the replicas, right? 0:15:33 It’s like you can record any sort of recording 0:15:35 and it’ll take that into consideration. 0:15:40 Although I’m almost 30, I feel like a schoolgirl when I think of you. 0:15:46 Maybe that’s the one we should use for our next wave intro. 0:15:47 Yeah, definitely. 0:15:50 That’s super cool. 0:15:54 Now, I know there’s like a sort of like marketplace as well. 0:15:57 Like, do you know Matt Vidpro, the AI YouTuber? 0:16:01 He trained his voice into Eleven Labs 0:16:05 and he put it into some sort of like marketplace too. 0:16:09 And now he says that he keeps on seeing like TikToks 0:16:12 and like Instagram reels and stuff that are using his voice. 0:16:14 And it’s like, it’s been weirding him out. 0:16:16 But like, what’s the whole like marketplace thing? 0:16:18 Yeah, yeah. 0:16:20 So voice actors, when they come to the platform, 0:16:23 they essentially can start earning when people use their voice. 0:16:27 And so we have people earning in the thousands of dollars a month passively, 0:16:29 just not doing anything and other people are using their voice. 0:16:34 And so I think Matt Vid is probably a recipient/victim according to… 0:16:38 Of this popularity, yeah. 0:16:43 But yeah, no, he’ll be earning for all that virality, which is cool. 0:16:46 Are there any limitations on how his voice can be used? 0:16:50 Like, can it be used for selling hymns or something like this? 
0:16:54 Yeah, there are people who have… 0:16:57 You know, they can keep their voice on the platform 0:16:58 for a specific period of time. 0:16:59 They can pull it back. 0:17:00 So he does have that control. 0:17:02 So that’s super cool. 0:17:05 Yeah, it’s like a, you know, if you want a little passive income stream, 0:17:09 and you have a decent sounding voice, 0:17:11 go generate or train your voice in there, 0:17:13 and you can sort of sell it on the platform. 0:17:14 That’s pretty cool. 0:17:17 Totally. And, you know, it’s not just one voice. 0:17:18 You might be able to do multiple voices. 0:17:22 So Carter here does his, like, you know, video game style character, 0:17:24 but he also has a casual conversation on. 0:17:29 This man was attacked by a shark, but incredibly, this attack saved his life. 0:17:30 And so you can see that he’s… 0:17:32 Wow, great brain. 0:17:36 We all know that regular exercise is good for the body. 0:17:37 A modern man’s man, Val… 0:17:39 And so he’s got different voices. 0:17:40 They’re all earning different ways. 0:17:42 It’s, you can, yeah, if you’re a voice actor, 0:17:44 I’m sure you can do a range of voices. 0:17:45 So, yeah. 0:17:45 Yeah, for sure. 0:17:46 Yeah, no. 0:17:48 That’s super cool. 0:17:49 Now, what’s the main use case right now? 0:17:50 Do you, is it, is it people doing, like, 0:17:52 faceless YouTube channels and stuff like that? 0:17:54 Or, like, what’s, like, what’s the main way 0:17:56 people are actually using Eleven Labs? 0:17:57 It’s a range, honestly. 0:17:59 They’re, the faceless YouTube channels are huge. 0:18:02 I was just watching a cricket game yesterday, 0:18:06 and the ads in between, I could recognize the voice. 0:18:07 I was like, that’s Brian. 0:18:12 And so there’s advertisements that it’s being used for. 0:18:15 And then, yeah, I think a lot of these, like, 0:18:19 video game things are now, are using them a lot, too. 0:18:22 Yeah, I’ve actually started even using the sound effects 0:18:23 feature a little bit more, too. 0:18:26 Like, you can have, like, a, you know, a knock on the door, 0:18:30 or, like, a loud crash, or, like, a, you know, an explosion, 0:18:33 or things like that, and just generate real quick sound effects. 0:18:36 And you don’t even need to go hunt them down on, you know, 0:18:37 stock sound effects sites anymore. 0:18:39 Just go to Eleven Labs, tell it what you want, 0:18:40 and generate it. 0:18:43 And now I’m starting to sound like a pitchman for Eleven Labs, 0:18:45 but, like, I legitimately do use it, so. 0:18:49 Like, how many languages work, too? 0:18:51 Like, is it good, like, going from one language to another yet, 0:18:53 or, like, where is that, where is that at? 0:18:54 Yeah, yeah. 0:18:56 So it goes between 32 different languages. 0:18:59 And yeah, you can, you can generate that same thing, 0:19:02 that text we had here in German. 0:19:03 Let’s actually do that. 0:19:04 Why not? 0:19:06 Yeah, that feels like probably the biggest unlock 0:19:08 that people haven’t really wrapped their heads around yet, 0:19:11 is the fact that you could produce content in one language, 0:19:14 and then make it, like, available in 33 different languages. 0:19:16 So it handles the translation as well. 0:19:18 You can type out a sentence in English, 0:19:20 and it’ll, like, translate it to Japanese, 0:19:23 and then speak it out in your voice in Japanese as well. 0:19:26 So you would have to do the translation yourself. 0:19:26 Okay. 0:19:30 But yeah, you will, you will get the effect. 
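As the conversation notes, the model does not translate for you; you bring the translated text and it keeps your voice. Here is a hedged sketch of that flow, again with placeholder credentials; "eleven_multilingual_v2" is the publicly documented multilingual model ID, and the German sentence is only an illustration, not the exact line from the demo.

```python
# Sketch: speaking pre-translated text with the multilingual model so the same
# (cloned) voice carries over into another language. Placeholders throughout.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"    # placeholder
VOICE_ID = "YOUR_CLONED_VOICE_ID"      # e.g. your own voice clone (placeholder)

# You handle the translation step yourself (by hand or with a separate tool);
# the multilingual model then reads the translated text in the same voice.
german_text = "Willkommen zum Next Wave Podcast mit Matt Wolfe und Nathan Lands."

resp = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY},
    json={
        "text": german_text,
        "model_id": "eleven_multilingual_v2",  # supports 30+ languages
    },
)
resp.raise_for_status()
with open("intro_de.mp3", "wb") as f:
    f.write(resp.content)
```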
0:19:31 Oh, I’m so weak. 0:19:34 We’ll come in some next wave podcast 0:19:38 with Matt Wolf and Nathan Lanz. 0:19:42 You were saying trailer boys, but now in German, yeah. 0:19:45 My name is cool in German. 0:19:50 I like Matt Wolf. 0:19:56 I mean, my last name is German anyway, so. 0:19:58 And yeah, on the sound effects front, 0:20:00 let’s just, I think I was trying to do this 0:20:02 for a video game I was making. 0:20:04 And let’s see, let’s hear what sound it makes. 0:20:09 Oh, yeah, that’s more like a barrier. 0:20:10 Yeah. 0:20:11 What’s the prompt? 0:20:13 Just for anybody who might just be listening on audio? 0:20:16 Yeah, so it’s a retro video game click sound. 0:20:17 Okay, got this one. 0:20:22 Yeah, and you kind of remind me of those classic 8-bit games. 0:20:24 Yeah, like in the menu or whatever, 0:20:25 you’re picking something in there. 0:20:27 Yeah, it was, like, moving through the menu. 0:20:29 Let’s hear a car whizzing by. 0:20:30 Let’s see what that sounds like. 0:20:37 Really fast car. 0:20:42 But yeah, the cool thing here, though, 0:20:44 is that exactly what you said, 0:20:46 it’s like instead of spending hours searching 0:20:49 for that sound effect, just describe what’s in your head 0:20:51 and then hopefully you get a sound close to it 0:20:53 and we send you multiple samples 0:20:55 so you can hopefully get there pretty quick. 0:20:56 Yeah, yeah, for sure. 0:20:59 Is there anything else in the desktop app 0:21:02 that we haven’t covered on this yet? 0:21:06 My favorite one, which is the latest thing we’ve got 0:21:08 that’s come out recently, is conversational AI. 0:21:12 So we’re literally letting anyone build conversational AI agents 0:21:15 that they can talk to and try out. 0:21:18 And that just, the API is super simple too, 0:21:20 so anyone can literally start building 0:21:23 their own conversational experiences. 0:21:26 One thing I made was a little assistant 0:21:27 for our Wiki internally. 0:21:29 We have all this stuff about our offices 0:21:30 and all that stuff. 0:21:33 And it’s like, ah, do I have to go read through this Wiki? 0:21:35 Why can’t I just ask this agent? 0:21:38 Hey, what’s the Wi-Fi password in the London office, right? 0:21:40 And it’ll tell me that. 0:21:43 And so it’s as easy as like you go in, 0:21:47 you describe what it is, you give it a system prompt, 0:21:49 and you have all these other settings 0:21:50 if you really want to play with them. 0:21:53 You can choose the LLM. 0:21:56 For instance, I can go to Gemini Flash or GP4 or Mini. 0:22:00 And then here you can see I’ve got all of the Wiki 0:22:01 as a knowledge base for it. 0:22:05 So it knows what it’s basing off the answers. 0:22:07 And then we can talk to it. 0:22:11 Yeah, it’s almost like building like a custom GPT, 0:22:14 like the actual process of building it. 0:22:15 And I actually played with that. 0:22:16 I actually forgot about that feature 0:22:17 until you just brought it up again. 0:22:20 But I played with it in like a video either last week 0:22:22 or the week before, whenever it came out. 0:22:24 It was really quite cool. 0:22:27 Now, when you do build one of these, two questions. 0:22:29 Is there a limit to how much like knowledge 0:22:31 you can put into it? 0:22:34 And two, can I like embed it on my website 0:22:35 or something like that so other people 0:22:37 can have conversations with it outside of 11 Labs? 0:22:40 Yeah, so it’s working with the context window 0:22:41 of the LLM that you’re using. 0:22:44 So there’s a bit of that that limits that. 
0:22:46 And then, yeah, you can totally embed it. 0:22:48 So we let you easily like copy out a widget. 0:22:51 You can customize it in the dashboard. 0:22:53 And then you literally just paste in this one line 0:22:55 and you’ll get your widget everywhere, 0:22:55 which is pretty cool. 0:22:57 That’d be super cool to make it like something like a– 0:23:00 instead of having a frequently asked question 0:23:01 for a company or something, you’ve got like a little widget. 0:23:04 And it’s like, here’s a little magic genie 0:23:04 that you just talked to or whatever. 0:23:08 Magic assistant that you talked to 0:23:09 and it’ll answer whatever. 0:23:10 Right, that’s so cool. 0:23:11 100%. 0:23:12 And the other cool thing is you can actually 0:23:15 give it success criteria. 0:23:17 And so you’ll know later on in the logs, 0:23:20 like, hey, when someone spoke to this thing, 0:23:22 did they find the thing that they were looking for? 0:23:26 And, you know, I can define it as it’s as simple as a prompt. 0:23:28 It’s like, if the question of the user has answered successfully, 0:23:29 then you helped. 0:23:31 You know, if they were like, hey, cool, like, 0:23:33 I got what I needed, that counts as success. 0:23:37 But yeah, I can give you a quick demo of what it sounds like. 0:23:40 The voice I picked out for this assistant 0:23:42 was a more sci-fi sounding robot 0:23:44 because I wanted it to feel like a classic one. 0:23:45 So it sounds like this. 0:23:47 Hello, I’m Ava. 0:23:50 We got the last big robotic voice. 0:23:52 She need the hologram now. 0:23:53 Yeah, exactly. 0:23:56 I was thinking of Cortana from Halo. 0:23:57 Yeah, that’s the vibe. 0:24:00 But let me talk to it. 0:24:01 Let’s see. 0:24:02 Hi, I’m Elle. 0:24:05 Hey, Elle, how’s it going? 0:24:07 Where is the London office? 0:24:11 The London office is located at Floor 5, 0:24:17 119 Wardore Street, London, W1F0UW. 0:24:20 If you need any more information about the office 0:24:23 or anything else, feel free to ask. 0:24:23 That’s very cool. 0:24:26 Tweet, yeah. 0:24:30 And I will say the speed before you comment on the speed 0:24:31 is the LLM also. 0:24:33 Okay, yeah, yeah. 0:24:36 Yeah, I think it wasn’t like Gemini Flash, 0:24:38 one of the options and Gemini Flash is pretty fast. 0:24:40 Yeah, it’s really fast. 0:24:42 That’s the one we kind of recommend because of the speed. 0:24:46 Though I find if you want a good balance of intelligence 0:24:50 plus the speed, then I think 4.0 is a good middle one. 0:24:51 Sweet. 0:24:52 No, that’s really cool. 0:24:53 I mean, hearing that voice, 0:24:56 having a conversation with that voice was pretty surreal though. 0:24:59 Yeah, yeah, yeah. 0:25:00 It’s really fun. 0:25:01 You can just kind of make your own characters 0:25:03 and embed them in different places. 0:25:07 And then yeah, the last thing I’ll show you guys is projects. 0:25:11 This is what we use for a lot of the folks who are publishing 0:25:14 and turning their books into audiobooks 0:25:16 or people who actually want their podcast transcript 0:25:19 maybe also regenerated and want to try something else. 0:25:22 So projects essentially let’s you kind of go line by line 0:25:26 and even choose different voices for different segments. 0:25:28 So it’s a really powerful editor for that long form stuff 0:25:31 that people might want to do. 0:25:32 And you can do that in your own voice, right? 0:25:34 So you could train it on your own voice. 
0:25:36 Because I guess one big gripe I have about audiobooks 0:25:38 is like when you’re the original author, 0:25:38 it’s so much better. 0:25:40 But when it’s somebody else, it’s like, whatever. 0:25:45 A lot of people are starting to use 11 Labs for audiobooks, 0:25:45 it seems like. 0:25:50 I think like in fact, I think like Amazon might even be letting 0:25:54 people use 11 Labs for like audible books now. 0:25:55 I’m pretty sure. 0:25:58 So it’s like you are starting to see more and more authors 0:26:01 just use tools like this and plug in their entire book. 0:26:02 Yeah, exactly. 0:26:05 And that’s also a big part of why we were excited 0:26:06 about doing the mobile app. 0:26:10 It’s like we’re giving all these indie authors 0:26:12 a way to self publish their content. 0:26:14 And now they can use our whole suite. 0:26:15 So it’s like they use projects. 0:26:17 They publish directly to our mobile app. 0:26:19 And then they reach a whole new audience. 0:26:23 And so it’s kind of fun to see the tools finally coming 0:26:26 together and connecting across different mediums as well, 0:26:28 which has been a fun evolution. 0:26:30 Yeah, I do wonder long term how that’s going to work out. 0:26:33 It feels like all these are going to kind of start to combine. 0:26:36 Are you guys competing with Suno? 0:26:38 Or like there’s runway in AI video, 0:26:40 but now they’ve got something with steals 0:26:41 where they’re showing images, right? 0:26:43 Like competing with Mid Journey. 0:26:44 And Mid Journey is working on AI video. 0:26:47 And at some point, they’re all going to want audio as well. 0:26:47 Right? 0:26:51 And so it’s like, how does this all play out long term? 0:26:51 It’s so true. 0:26:55 I think we’re seeing the convergence of all the different, 0:26:58 you know, even the Luma’s recent thing, 0:27:00 where it’s like now it’s a canvas for creativity 0:27:02 and it has for you and photos and stuff. 0:27:03 I think the consumer wins. 0:27:04 That’s for sure. 0:27:06 We’re having a great time with all these tools. 0:27:07 For sure. 0:27:08 Let’s talk about the app a little bit too, 0:27:12 because I know the app’s got the reader in it, 0:27:13 which we were talking about a little bit, 0:27:16 that’ll read PDFs to you or read articles 0:27:17 or things like that to you. 0:27:19 You can have Jerry Garcia read it to you. 0:27:22 Or was Bert Reynolds, I think it was one of them, right? 0:27:24 Yeah, that was my favorite. 0:27:27 So you’ve got like all of these options 0:27:28 for who can read it to you. 0:27:32 And then now the newest feature is the Gen FM, 0:27:35 where, you know, similar to what they did with the Notebook LM, 0:27:37 it’ll do podcasts, but you know, 0:27:38 it’s got like new voices. 0:27:42 And can you use the new Gen FM 0:27:45 with like different voices or is it set voices right now? 0:27:49 Yeah, so right now we’ve kind of curated pairs. 0:27:52 And so we’ve kind of got some voices 0:27:53 are really great for tech content. 0:27:55 Others are great for politics or studying or whatever. 0:27:57 And based off that content, 0:27:59 it’ll pick out your co-hosts. 0:28:02 But we will add more customization. 0:28:03 That’s a given. 0:28:05 That’s kind of what we’re really excited about. 0:28:07 We even played around with internally, 0:28:10 like what does it look like when, you know, 0:28:12 Deepak Chopra is trying to understand the recipe 0:28:13 for chicken marsala. 0:28:17 It was amazing. 
0:28:21 Or do like a podcast of Deepak Chopra 0:28:24 trying to understand like Gen Alpha slang or something? 0:28:26 Sure, you know, exactly. 0:28:27 I want to hear that. 0:28:31 So yeah, we’re loving the direction it can take 0:28:32 with all the voices, for sure. 0:28:33 Very cool. 0:28:34 All right, cool. 0:28:37 So the mobile app, yep. 0:28:38 As you can see here on the homepage, 0:28:40 we’ve recently got Gen FM, 0:28:42 and you’ll see it on your homepage right here 0:28:45 with the two co-hosts kind of flying around your screen. 0:28:49 But you go into it, you import an article. 0:28:52 So I’m just pasting in an article that I want to hear. 0:28:55 And I’m just going to hit generate. 0:29:02 And so what was fun about this? 0:29:08 Is we took this state that’s kind of boring, right? 0:29:10 Like you’re waiting for your podcast to load 0:29:13 and we turned it into a fun, interactive moment. 0:29:15 It’s like your co-host are showing up. 0:29:18 We’re getting them ready in the room. 0:29:22 You’re not just sitting there, yeah. 0:29:24 Yeah, yeah, it’s not just sitting there. 0:29:26 And then when your podcast is ready, 0:29:28 it kind of sounds something like this. 0:29:30 So let’s hear them. 0:29:32 Let’s hear them. 0:29:34 Counter-intuitive. 0:29:35 Underreacting. 0:29:37 A superpower for the modern age? 0:29:39 Well, not quite. 0:29:41 Today we’re discussing how learning to underreact 0:29:44 may actually be the key to making a real difference 0:29:46 in our increasingly chaotic world. 0:29:49 That’s an intriguing concept. 0:29:52 In a world that often feels like it’s spinning out of control, 0:29:55 the idea of underreacting might seem counter-intuitive. 0:29:57 Can you elaborate on what you mean by that? 0:29:59 Of course. 0:30:02 You see, I used to be the kind of person who would rush in. 0:30:05 And so you can see, you get a little bit of a discussion. 0:30:07 And you know, it’s been so fun 0:30:09 because I’ve tried it out on all sorts of content. 0:30:15 For instance, my dad had written this essay about parenthood 0:30:19 and I then played it back to him, 0:30:22 but with two AI co-hosts discussing his essay. 0:30:24 And it was like a completely different perspective. 0:30:27 And he was like, wow, it feels so… 0:30:28 Wow, this is genius. 0:30:34 He either felt like a genius or like he was being judged. 0:30:37 But it’s amazing, right? 0:30:41 It’s such a fun way to hear an entirely new perspective 0:30:42 on a piece of content. 0:30:46 And it can also combine a whole bunch of pieces of content. 0:30:49 So if I wanted to do a morning news brief, 0:30:53 I can go to all of the various websites that I get my AI news from, 0:30:54 toss them all into the Reader app, 0:30:57 and it’ll make a podcast that rounds up 0:30:59 all the news for that day for me, right? 0:31:00 Yeah, yeah, exactly. 0:31:01 You can get all sorts of… 0:31:03 You just give it a bunch of content. 0:31:07 We will then take out insights and other perspectives 0:31:08 you might not have gleaned from it. 0:31:12 And then to the extent that we can with the LLM’s knowledge, 0:31:15 of course, like how does it tie to other events 0:31:16 you might not have known about? 0:31:19 And then that brings out a cool insight as well. 0:31:21 So yeah, it’s been fun. 0:31:24 And I’ll tell you, the team really scrambled for this one. 0:31:26 We thought about the idea and we were like, 0:31:28 “Oh, we gotta make it happen.” 0:31:32 And then we just started going at it a week and a weekend, 0:31:33 and then it was real. 
0:31:37 And then we just kept fine-tuning it till we got to this point. 0:31:41 Awesome. Yeah, it’s a super cool app. 0:31:42 I have one question about it. 0:31:42 Can you actually… 0:31:45 I actually have not played with the Gen FM. 0:31:47 I think the day we’re recording this is either 0:31:49 the day it came out or the day after it came out. 0:31:52 So it’s really fresh as the day that we’re recording this. 0:31:55 But can you actually download the podcast episode 0:31:57 when you’re done, like get like an MP3 version of it 0:31:58 or something like that? 0:32:01 Yeah, so we don’t have full episode downloads yet, 0:32:03 but you can share them with your friends 0:32:04 and that will send them… 0:32:07 I think about, well, we love number 11, 0:32:10 so it’s a one-minute, 11-second clip that you can send. 0:32:14 Obviously, it’s similar to notebook LLM, right? 0:32:16 So I’m curious, like right now, 0:32:17 what’s the main difference? 0:32:19 Is it like you guys have a lot more choices of voices, 0:32:20 I assume? Is that one… 0:32:25 Yeah, I think there are a few things that we think 0:32:27 make Gen FM stand out. 0:32:29 And I think one is the voice library for sure. 0:32:31 It’s like really realistic voices 0:32:34 and you have a whole range of potential co-hosts. 0:32:36 The other thing is the languages piece. 0:32:36 You know, that’s… 0:32:40 We’re still staying true to that, which is 32 languages. 0:32:42 So anyone can have all sorts of different podcasts 0:32:44 and different languages, which we’re excited about. 0:32:46 And then I think a mobile first experience, 0:32:49 because this is something where I don’t necessarily 0:32:51 want to pull up like a notebook style page 0:32:53 and like listen to a podcast there. 0:32:56 I want to hear it on my phone while I’m on the commute 0:32:56 or something like that. 0:33:00 And we think this is a more natural way to do it. 0:33:02 Yeah, I mean, we recently did a tier list for AI. 0:33:04 And we did put 11 Labs in A, 0:33:07 and we did put notebook LLM after debating it in B. 0:33:10 And the big reason was, or was it B or C? 0:33:10 Maybe I think it was B. 0:33:14 And the reason was, you know, I’m highly skeptical 0:33:16 that Google can actually turn that into a product 0:33:18 that people actually use. 0:33:20 And I think 11 Labs, honestly, has a better chance, 0:33:22 you know, with the whole suite of things 0:33:23 that you guys have people using, 0:33:24 it seems more natural that, yeah, 0:33:26 it goes into 33 different languages. 0:33:27 That’s great. 0:33:29 You turn a podcast into 33 different languages. 0:33:32 That’s an actual use case that people could do today 0:33:35 and make money from that, versus with the Google product, 0:33:37 with notebook LLM, I’m not sure right now 0:33:39 what you would actually use it for in terms of business. 0:33:41 Yeah, yeah, I think I’m clear. 0:33:43 I see the studying use case. 0:33:44 I think it’s great. 0:33:48 And I think it’s a fun way to explore the content. 0:33:51 And also, kudos to them for really showing people 0:33:53 this was a fun new way to look at it. 0:33:58 But yeah, we’re absolutely excited to make something great. 0:34:01 Well, I don’t know if you can answer my next question or not. 0:34:02 You may not be allowed to, 0:34:05 but is there anything exciting that’s upcoming 0:34:08 for 11 Labs that maybe you could tease for us? 0:34:09 Or that you can break on the show. 0:34:15 Well, I think we teased Music way early on. 
0:34:19 And we’re excited that that model is really coming together 0:34:20 and sounding great. 0:34:23 So I think that’ll be one of the next things 0:34:24 you guys will see very soon. 0:34:26 Interesting. 0:34:30 And then, of course, the other thing we’re always thinking about 0:34:35 is how does all these different parts of audio come together, 0:34:38 music, sound, speech, and so on. 0:34:40 So things are being explored. 0:34:42 You know, I’ll wait for the right moment. 0:34:44 I think the right teammates, 0:34:47 the research team who’s really cranking and doing all of this 0:34:48 and deserve all the credit, honestly. 0:34:53 Yeah, I’m waiting for them to feel right, 0:34:54 reveal to the world what they’ve been cooking. 0:34:55 And it’s amazing. 0:34:56 Awesome. 0:34:58 Now, this has been really cool. 0:34:59 I appreciate you jumping in. 0:35:02 I know you’ve got a really great X account. 0:35:02 I follow you on X. 0:35:04 You share a lot of really cool stuff about 11 Labs, 0:35:07 but also not always just 11 Labs, right? 0:35:09 It’s not a pure 11 Labs pitch account, 0:35:11 but you share some really cool stuff, 0:35:13 a lot of stuff you’re experimenting with. 0:35:14 You build a lot of apps. 0:35:16 You’ve been doing a lot of AI coding stuff. 0:35:18 In fact, we’re talking about doing a follow-up episode 0:35:20 where we break down some of the cool apps 0:35:22 that you’ve been working on and how you’ve built them. 0:35:25 So definitely everybody needs to follow Amar over on X. 0:35:28 Do you want to go ahead and shout out your X account? 0:35:31 I always feel awkward when I’m shouting on my own account, 0:35:34 but yeah, I’m at Amar on X. 0:35:37 Yeah, A-M-M-A-A-R. 0:35:39 Because I know how badly the priestess has spelled my name. 0:35:43 The second A keeps throwing me off. 0:35:46 I’m like, “Get the real thing, get the real thing.” 0:35:49 I think to what you said, 0:35:54 as someone who doesn’t really code and finds it hard to code 0:35:56 and to be able to just talk to AI now 0:35:58 and to build so many different apps, 0:36:01 it’s like, this is the time I keep telling people 0:36:03 where you can be the idea person who ships. 0:36:04 I appreciate it. 0:36:05 Well, thanks so much, Amar. 0:36:07 This has been an awesome conversation. 0:36:09 And yeah, thanks again for hanging out with us. 0:36:12 And anybody tuning in, 0:36:14 make sure that you subscribe to this podcast 0:36:15 wherever you listen to podcasts. 0:36:17 If you’re watching on YouTube, subscribe to us there. 0:36:19 If you’re Spotify, Apple Podcast, 0:36:22 wherever you listen to podcasts, you can find this show. 0:36:23 Please subscribe to us. 0:36:24 Really appreciate you. 0:36:25 We’ll see you in the next one. 0:36:25 Bye-bye. 0:36:28 (upbeat music) 0:36:31 (upbeat music) 0:36:33 (upbeat music) 0:36:36 (upbeat music) 0:36:38 (upbeat music) 0:36:41 (upbeat music) 0:36:43 you 0:36:45 you
Episode 38: How revolutionary is the latest in AI voice technology? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) dive deep into this topic with Ammaar Reshi (https://x.com/ammaar), head of design at ElevenLabs and AI enthusiast who has made waves with his innovative AI projects.
In this episode, Ammaar takes us through the cutting-edge features of ElevenLabs, a platform revolutionizing content creation with AI-driven voice technology. From monetizing pre-recorded voices to producing multilingual content and even generating music, explore how ElevenLabs is transforming the way we create and consume audio content. They also delve into Ammaar’s background, discussing his transition from viral AI art to leading design at ElevenLabs, and the exciting developments on the horizon for AI in audio.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Discussing AI business tools with ElevenLabs.
(05:28) Co-founders initiated dubbing innovation for accessibility.
(07:52) Exploring ElevenLabs features, including iPhone app.
(10:47) Stability affects voice similarity and style.
(13:49) Browse library of diverse platform voice actors.
(17:37) Using ElevenLabs for quick sound effects.
(20:21) Anyone can build simple conversational AI agents.
(25:20) Mobile app empowers indie authors for self-publishing.
(31:40) GenFM: Realistic voices, 32 languages, mobile experience.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
Google Flew Me To London To Test Project Astra (Live AI Demo)
AI transcript
0:00:03 (upbeat music) 0:00:05 When you think of the most iconic logo in the world, 0:00:06 which brand comes to mind? 0:00:08 For me, it’s probably Nike. 0:00:10 Their former CMO Greg Hoffman knew exactly
In this episode, Matt and Nathan dive deep into their first-hand experiences with Project Astra in London. They discuss the groundbreaking capabilities of Astra, including multimodal conversations, advanced personalization, and its potential to perform complex tasks. They also touch on the progress of Gemini models, Google’s approach to data privacy and security, and the fierce competition in the AI industry. Tune in to hear how Astra could redefine digital assistants and the implications this has for the future.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Google DeepMind’s Project Astra: Multimodal AI Assistant.
(03:28) Visited DeepMind in London; experienced glowing presentations.
(06:45) Provides information on nearby attractions and objects.
(11:42) Launch timing uncertain; testing and satisfaction pending.
(12:43) Interviewed Project Astra leads on features, privacy.
(17:00) Metacognition requires experiential learning for accuracy.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
0:00:01 I don’t know if you’ve heard about this one yet, 0:00:03 but Sora got leaked. 0:00:07 This actually probably benefited OpenAI more than anything, 0:00:09 which I find really, really fascinating. 0:00:11 – Yeah, it’s interesting to see like OpenAI 0:00:13 really go about product development differently. 0:00:15 – Yeah, it looked the most realistic and consistent to me 0:00:17 out of anything I’ve seen. 0:00:19 – And so if they were better and from the turbo model, 0:00:20 that’s quite impressive. 0:00:23 (upbeat music) 0:00:25 – Hey, welcome to the Next Wave podcast. 0:00:27 I’m Matt Wolf, I’m here with Nathan Lanz, 0:00:31 and there has been a lot of craziness in the AI world 0:00:34 over the last few weeks, a lot of big things happening. 0:00:36 And in this episode, 0:00:37 we’re gonna just break it all down for you, 0:00:39 share our thoughts and opinions, 0:00:41 and let’s just go ahead and dive right into it. 0:00:44 (upbeat music) 0:00:46 – When all your marketing team does is put out fires, 0:00:47 they burn out fast. 0:00:50 Sifting through leads, creating content for infinite channels, 0:00:54 endlessly searching for disparate performance KPIs, 0:00:55 it all takes a toll. 0:00:59 But with HubSpot, you can stop team burnout in its tracks. 0:01:01 Plus, your team can achieve their best results 0:01:02 without breaking a sweat. 0:01:05 With HubSpot’s collection of AI tools, 0:01:08 Breeze, you can pinpoint the best leads possible, 0:01:11 capture prospects attention with click-worthy content, 0:01:14 and access all your company’s data in one place. 0:01:16 No sifting through tabs necessary. 0:01:19 It’s all waiting for your team in HubSpot. 0:01:20 Keep your marketers cool 0:01:22 and make your campaign results hotter than ever. 0:01:25 Visit hubspot.com/marketers to learn more. 0:01:31 – We’re going to set the stage 0:01:33 and tell you a little bit about what’s been going on 0:01:36 in the world of AI over the last couple of weeks. 0:01:38 Last year around November, 0:01:40 AI news just sort of died off, right? 0:01:41 When you get around the holidays, 0:01:43 you don’t really get a ton of news 0:01:45 ’cause even all the AI companies 0:01:47 sort of take the time off as well. 0:01:50 The biggest news that came out of November of 2023 0:01:53 was Sam Altman getting booted from open AI 0:01:55 and then a few days later coming back to open AI. 0:01:56 But other than that, 0:01:58 there was really nothing going on in November 0:01:59 around the world of AI. 0:02:02 That doesn’t feel like the case this year in November. 0:02:04 It seems like there’s really been 0:02:06 not much of a slowdown at all. 0:02:07 Like there’s a lot to talk about. 0:02:10 We wanted to start with a very interesting thing 0:02:12 that happened with open AI and Sora 0:02:15 because Sora got leaked. 0:02:16 And I use leaked in air quotes 0:02:18 ’cause I don’t really feel like it got leaked. 0:02:21 Like somebody made a Python script essentially 0:02:23 that linked to Sora’s API. 0:02:25 And a bunch of people were able to basically 0:02:27 generate videos really quickly 0:02:31 while it was online for like the two hours that it was online. 
0:02:34 So basically a group of like artists 0:02:36 that got early access to Sora, 0:02:39 at least according to this like write up that they did here, 0:02:41 they got early access to Sora, 0:02:43 but they were frustrated by the fact 0:02:45 that open AI was telling them 0:02:47 what videos they could share with the world, 0:02:49 what videos they couldn’t share with the world. 0:02:53 And they felt like open AI was taking advantage of them. 0:02:56 They basically said that you gave us access to this stuff, 0:02:57 you let us play with it. 0:02:59 But if we want to publish a video, 0:03:01 it’s only videos that open AI essentially cherry picks 0:03:03 and lets us put out. 0:03:04 We’re here testing it for you. 0:03:06 We’re red teaming it for you. 0:03:08 We’re doing all this work for you for free. 0:03:12 And all we’re getting in return is access to Sora, 0:03:14 which we can’t share anything we make 0:03:15 without your permission first. 0:03:18 Therefore, that’s why we’re leaking it. 0:03:20 But again, the leak was really just like, 0:03:23 they went on hugging face, they put up a Python file, 0:03:26 the Python file had an API endpoint that pointed to Sora, 0:03:29 which means people were able to go on hugging face 0:03:31 for like a two or three hour window, 0:03:34 generate videos with Sora and, you know, 0:03:36 kind of share them around. 0:03:37 But then pretty quickly, 0:03:39 because they were just sharing an API endpoint, 0:03:42 open AI was able to go and shut down that endpoint 0:03:43 and nobody was able to generate videos anymore. 0:03:47 So the loophole was closed pretty, pretty quickly. 0:03:52 But in my opinion, this had like the reverse effect 0:03:55 of what these people that leaked it were going for, right? 0:03:58 Because they leaked it. 0:03:59 It didn’t stay online for very long. 0:04:02 So not many people got access to play with Sora, 0:04:05 but everybody’s talking about Sora again, right? 0:04:07 So it’s like, this actually probably 0:04:09 benefited open AI more than anything. 0:04:11 – Actually, I saw a lot more negative sentiment 0:04:13 towards the leaker than I would expect it. 0:04:15 I kind of expected to be kind of like both sides, 0:04:17 kind of stuff, at least on X. 0:04:18 I didn’t see that. 0:04:19 It was all like pretty much one-sided. 0:04:20 Like this is kind of petty. 0:04:23 Like they give you access to this really cool tool, 0:04:25 which you could have personally and professionally benefited 0:04:27 from by having early access. 0:04:30 And you decided to do what? 0:04:32 Like do it in the middle finger 0:04:35 and like illegally leak the videos. 0:04:36 – Yeah, and they were all under NDA too. 0:04:37 So they all broke. 0:04:40 So whoever leaked this broke NDA to do it. 0:04:43 And then basically open AI shut down 0:04:45 the Sora like creator program. 0:04:48 So all of the creators that had access to Sora 0:04:51 that were able to use it behind the scenes all lost access. 0:04:54 No, but it’s like, all right, since somebody leaked it, 0:04:56 we don’t know the exact source of the leak. 0:04:57 We’re closing it for everybody. 0:04:59 Nobody gets to use Sora anymore, right? 0:05:02 So it’s like all of the creators 0:05:03 that got access behind the scenes, 0:05:05 most of them were frustrated that this happened 0:05:08 ’cause they lost access to Sora for themselves. 0:05:10 And yeah, it does feel kind of petty. 0:05:13 I mean, you got access to this tool. 
0:05:16 I think more of what the negative sentiment around it 0:05:19 was that open AI was trying to super strongly control 0:05:21 the narrative of like what gets shared. 0:05:23 So like all the videos you’re seeing 0:05:25 are only the cherry-picked videos. 0:05:28 And this leak kind of showed that 0:05:29 Sora has a lot of the same flaws 0:05:32 that some of the other video generators have, right? 0:05:34 But at the end of the day, 0:05:35 we’re on a podcast talking about Sora 0:05:37 and we hadn’t talked about Sora in months 0:05:38 ’cause it’s been irrelevant. 0:05:41 Well, now all of a sudden Sora is relevant again, so. 0:05:42 – Oh yeah, that was a conspiracy theory, right? 0:05:43 It’s like they did it on purpose, 0:05:46 but I’m like, I just don’t really believe that. 0:05:48 I don’t see why they would do that, 0:05:49 but maybe you believe that, I don’t know. 0:05:52 – No, I do not believe that this was open AI, 0:05:56 like leaking Sora to get people talking about it again. 0:05:57 That wouldn’t really make any sense. 0:05:59 I also think, you know, 0:06:01 open AI probably lost a little bit of money. 0:06:02 I mean, in the grand scheme of things, 0:06:03 probably not too much money, 0:06:05 but like, you know, making an open, 0:06:08 like putting this API endpoint on the web 0:06:09 where anybody could use it for hugging face, 0:06:12 I don’t know what it cost to generate one of these videos, 0:06:14 but people were able to do a whole bunch of them 0:06:16 for free in this small window that it was open, right? 0:06:20 So it’s like, it definitely had a slight negative impact. 0:06:23 – Yeah, one thing worth noting is I think someone showed 0:06:25 from the API calls that it was actually a model, 0:06:28 a turbo model, which likely means 0:06:32 that it was a slightly lower quality, faster model. 0:06:34 So I would say that if that’s the case, 0:06:37 all the videos that I saw, the examples, 0:06:40 they were not dramatically better than the models out there 0:06:41 that I’ve seen, like from Runway and others, 0:06:43 but they were better. 0:06:44 And so if they were better and the turbo 0:06:47 from the turbo model, that’s quite impressive. 0:06:50 – I mean, I think when it comes to people, 0:06:52 I think it is probably the best model 0:06:54 for generating people like walking. 0:06:56 It looked the most realistic and consistent to me 0:06:57 out of anything I’ve seen, you know, 0:06:59 that already exists out there. 0:07:01 So, and I agree that the turbo model, 0:07:03 supposedly it’s a lot faster to generate. 0:07:06 It can only generate up to like one minute videos total. 0:07:08 Maybe it was only 10 seconds, I don’t know. 0:07:10 But it had like a shorter generation time 0:07:13 than like the main Sora model. 0:07:14 But yeah, at the end of the day, 0:07:18 I don’t think it was actually smart by the leaker. 0:07:21 It pissed off more people than like, 0:07:23 than people actually liked it. 0:07:25 I think the only people that are really happy about it 0:07:28 were like the, you know, the AI art community 0:07:31 that is super anti-AI, right? 0:07:32 There’s been a few new announcements 0:07:34 that came out of Anthropic this week. 0:07:39 They introduced their new model context protocol, 0:07:42 says today we’re open sourcing the model context protocol, 0:07:44 a new standard for connecting AI assistance 0:07:46 to the systems where data lives, 0:07:48 including content repositories, 0:07:50 business tools and development environments. 
0:07:52 It aims to help frontier models produce 0:07:55 better, more relevant responses. 0:07:58 So kind of what I’m taking away from this 0:08:02 is that it’s sort of like a enterprise feature, I guess, 0:08:04 or like a feature to use like internally 0:08:06 in your company to sort of connect it 0:08:09 to your data sources so that Claude has that context 0:08:12 of your data sources connected to it. 0:08:14 Is that kind of what you’re taking away from this as well? 0:08:16 – Yeah, well, I mean, I haven’t had time 0:08:17 to like really deep dive into it yet, 0:08:21 but like from reading on Reddit and like X on places, 0:08:23 the big takeaway seems to be that, 0:08:24 yeah, you could kind of do this stuff before, 0:08:26 but you need to create a lot of custom software 0:08:29 to connect the LLM to your private databases 0:08:30 and information. 0:08:32 And it sounds like now it’s dramatically easier 0:08:34 ’cause it kind of just takes care of that for you. 0:08:35 – Yeah, I mean, honestly, 0:08:36 this is a little out of my wheelhouse. 0:08:39 It’s not something that like I personally use 0:08:42 or know a whole lot about, you know, 0:08:44 I tend to play more on the creativity side 0:08:47 and the like large language model side of like, 0:08:50 you know, getting it to generate stuff that’s useful for me. 0:08:53 I don’t really have any extra like data stores 0:08:54 I’m trying to connect to right now. 0:08:57 So it’s not something I’ve personally tested. 0:09:00 There is another announcement that came out of Anthropic, 0:09:01 which I have tried. 0:09:04 So I can talk a little bit more intelligently about that. 0:09:06 They just rolled out a new feature 0:09:09 where you can create your own personal style. 0:09:12 And so what this is, is if you go into Claude, 0:09:15 there’s a dropdown that says choose style. 0:09:17 And under the choose style, 0:09:19 you have the ability to create a custom style. 0:09:21 And the way you create a custom style 0:09:24 is by uploading a whole bunch of texts that you’ve written. 0:09:27 So if you have written articles or written a book 0:09:30 or have blog posts or even if you take transcripts 0:09:32 from your videos or something like that, 0:09:34 you can load all of this text in there 0:09:37 and it learns your speaking style, 0:09:39 your typing style, whatever you wanna call it. 0:09:41 It basically learns on that text 0:09:43 to try to duplicate your style. 0:09:46 So now when you go to Claude and you give it a prompt, 0:09:49 you can actually select make it sound like me essentially 0:09:51 and the response it’ll give you 0:09:55 will be sort of in the style of how you would write. 0:09:57 Or if you put transcripts, 0:10:00 it would be sort of in the style of how you’d talk, right? 0:10:02 I played with it, it’s really easy to use. 0:10:06 It’s pretty cool, but I didn’t really feel like it sounded 0:10:08 like me when I had it generate. 0:10:11 Like I gave it three of my YouTube video transcripts. 0:10:14 So I basically gave it 90 minutes worth of transcripts 0:10:17 to try to learn how I talk and what it gave back to me 0:10:20 had a whole bunch of like emojis integrated in it. 0:10:23 And it said stuff like, what’s up guys? 0:10:25 I got something killer for you today 0:10:28 that you’re gonna think is really spicy. 0:10:29 And then it had like a rocket emoji. 0:10:31 And it was like, what? 0:10:34 Like this, like there was no emojis 0:10:35 in the transcripts that I uploaded. 0:10:39 Why are there emojis in the style that I got back? 
0:10:40 But the cool thing about it 0:10:43 is you can actually sort of continue to fine tune it. 0:10:46 So when it gives you a style back, 0:10:48 you can go and click edit style. 0:10:49 And there's a little chat window, 0:10:52 just like if you're using a custom GPT 0:10:53 where you could give it feedback. 0:10:55 So I gave it the feedback like, 0:10:57 I never put emojis in my text. 0:11:01 And like it's a little more casual than I actually speak. 0:11:04 I speak casually, but this is too casual. 0:11:06 And it actually sort of tweaks it 0:11:09 and tries to get it even closer and closer to your voice. 0:11:10 But right out of the box, the first time I did it, 0:11:13 it was not amazing, 0:11:14 but you can actually sort of continue 0:11:16 to fine tune it a little bit. 0:11:18 – Yeah, it's interesting to see like Anthropic 0:11:21 and OpenAI really go about product development differently. 0:11:25 Like recently, Anthropic's been releasing things faster, 0:11:26 but some of them are kind of half-baked, 0:11:29 like the computer use thing and whatnot, right? 0:11:30 Where it's like, cool, you did it first, 0:11:32 but people already heard that OpenAI 0:11:35 was building something similar for a while. 0:11:36 And they haven't released it. 0:11:39 So it's interesting that they are releasing cool things 0:11:40 that don't always perfectly work. 0:11:43 And it does seem that OpenAI is kind of taking the path 0:11:44 of like be more like Apple, 0:11:46 where you try to wait until things are very good. 0:11:48 Same thing with what they're doing with Sora. 0:11:49 Apparently they've got a great AI video model, 0:11:50 but they're waiting to release it 0:11:52 where they feel like it's actually the right time 0:11:54 when it's actually super useful for people. 0:11:56 – Yeah, it makes me wonder if they, 0:11:57 like if they're just kind of taking 0:11:58 two different business approaches, right? 0:12:01 Like Anthropic is like, let's put it out before it's ready, 0:12:03 but collect a ton of feedback. 0:12:04 Let's find out what people like about it, 0:12:06 what people don't like about it. 0:12:08 Like, I mean, that's been a practice a lot of, 0:12:10 I mean, Microsoft is actually pretty known 0:12:11 for doing that, right? 0:12:14 Like the first version of the Xbox they came out with 0:12:16 like was not, didn't work very well, 0:12:18 and they waited for feedback 0:12:21 and then like made better versions based on feedback. 0:12:23 So I mean, like the big tech companies 0:12:25 have been doing this kind of thing forever. 0:12:27 Like let's put something out that we know is not ready yet, 0:12:29 but let's see what people say about it. 0:12:33 – I ended up meeting Kevin Bachus from Microsoft 0:12:35 because of the Xbox thing. 0:12:38 Because I sent him emails about the red ring or whatever. 0:12:40 I got like, I got that like two times in a row. 0:12:42 I was living in Florida and when you buy the Xbox 0:12:45 every time you'd buy it, like it would get hot one day. 0:12:47 And then the thing would just permanently die. 0:12:49 – Yeah, I agree too. 0:12:51 But like, I feel like it's slightly different 0:12:52 with software versus hardware, right? 0:12:54 Like if you buy an Xbox 0:12:55 and it's like broken out of the box, 0:12:56 you're pissed off, right? 0:13:00 Cause it's not like, like, you probably have to replace it. 0:13:03 It's, if it's like red ring of death, 0:13:04 a firmware update is not going to fix that.
0:13:05 Cause you can't boot it up 0:13:08 to get the firmware update installed, right? 0:13:13 A software thing like Anthropic can push out a new feature 0:13:16 and know that it's not fully what they want it to be, 0:13:19 but then pretty quickly like push out new updates 0:13:22 to get it up to where people want it, right? 0:13:24 – Yeah, but both approaches make sense 0:13:25 for like their current states, right? 0:13:26 Like OpenAI is the leader. 0:13:28 They have more eyes on them. 0:13:29 If you release something bad, 0:13:31 people could in theory not come back 0:13:32 or just have a horrible experience 0:13:34 and it's like, I'm never gonna try that again. 0:13:35 – Yeah. 0:13:37 – Whereas Anthropic's in catch-up mode, 0:13:39 so anything they can do for attention is smart. 0:13:39 – I agree. Yeah. 0:13:41 I think that's kind of the thing right now 0:13:45 is that I feel like the narrative among AI people, right? 0:13:46 The people that are sort of paying attention 0:13:48 and talking about AI all the time, 0:13:50 the sort of narrative is that like OpenAI 0:13:54 shares a lot of cool stuff, but never really ships, right? 0:13:56 And I think Anthropic might be seeing that going, 0:13:57 all right, let's be the company that ships 0:13:59 if we really want to compete with OpenAI. 0:14:00 – The feature sounds awesome. 0:14:02 I mean, like the idea to personalize, 0:14:05 I want to try one thing that's really been annoying me 0:14:08 with Claude is that, you know, I type really fast 0:14:09 and I used to be way more accurate. 0:14:11 I feel like as I've gotten older, 0:14:13 I still type incredibly fast, but I do make more errors. 0:14:14 – Right. 0:14:15 – And I liked with ChatGPT 0:14:17 that I could still type incredibly fast. 0:14:18 And if there's a typo, who cares? 0:14:21 Because it would always just, it would know what you meant. 0:14:22 And I love that. 0:14:24 It's like, that's like a magical thing for me. 0:14:27 'Cause like type super fast, make a mistake, who cares? 0:14:29 – And you feel like Claude doesn't do that? 0:14:31 – Claude corrects me. 0:14:33 And I find it super annoying. 0:14:35 I'm like, I don't know why. 0:14:36 It's like, I don't want this thing called Claude, 0:14:38 like correcting me. 0:14:40 Like, oh, by the way, you typed, you know, 0:14:41 you spelled it this way or whatever. 0:14:42 It's like, I don't know, whatever. 0:14:44 But Google's been doing that forever. 0:14:44 Right? 0:14:45 Like, did you mean? 0:14:46 – Yeah. 0:14:47 – And then like. 0:14:48 – Yeah. 0:14:49 But OpenAI does not do that, right? 0:14:50 So I just, like, 0:14:51 ChatGPT does not do that. 0:14:53 So I think that's actually been a really annoying thing 0:14:54 for me with Claude. 0:14:56 Like literally, like, when that happens, 0:14:57 I'm like, screw this program. 0:14:58 I know what it is. 0:14:59 – That's funny. 0:15:00 'Cause I've actually never noticed that. 0:15:01 I don't know if I've ever seen it, 0:15:02 like correct me like that. 0:15:07 But like moving along, I know V0 has some updates. 0:15:10 In fact, the last episode we put out, 0:15:11 we actually played around with V0 0:15:16 a little bit, but I don't know much about the update. 0:15:17 But you mentioned before we hit record 0:15:19 that V0 has some new updates. 0:15:20 So. 0:15:22 – And this shows you how fast things are moving in AI. 0:15:25 Like probably 30 minutes after we recorded our episode, 0:15:27 they released a new update to V0 0:15:29 that everyone's been talking about.
0:15:31 And it, you know, we showed in the last episode 0:15:33 that you could clone a website pretty easily with V0, 0:15:34 but it wasn't perfect. 0:15:36 There's a lot of things that were different. 0:15:39 You know, it was probably like a 70% match 0:15:40 or something like that. 0:15:41 I would say they're like in the realm 0:15:46 of like a 95% match now, which just, you know, 0:15:48 like for anyone who's creating a new website now, 0:15:52 you could go to V0 and show it a UI that you like 0:15:53 or design. 0:15:54 – I think you can just give it a URL too. 0:15:57 Can't you just say, I want a site that looks like this URL 0:15:58 and it'll actually look at the URL for you? 0:16:01 – Well, what it's doing is it looks at the URL 0:16:02 and it takes a screenshot. 0:16:03 – Yeah. 0:16:05 – And then it puts that into the system, 0:16:06 into the LLM or whatever. 0:16:07 And so that's what they're doing. 0:16:09 So like you could do it, if you did a design, 0:16:11 then obviously the same thing would work. 0:16:12 You design it, you put it in there 0:16:13 and it would give you the basic code 0:16:15 for that design as a website. 0:16:17 So like it's unlocked so many new opportunities 0:16:20 and you know, you could have people copying people, 0:16:21 which you know, I'm not really a big fan of, 0:16:23 but that will definitely be happening. 0:16:25 So if you're in business, be aware that people 0:16:27 will now be able to just literally type in your URL 0:16:29 and say, I want that, give me that. 0:16:31 And I don't want to pay anyone for it. 0:16:33 I don't want to hire a developer or anything. 0:16:35 I just want that. 0:16:37 And you'll get like 95% of the way there. 0:16:40 The other big thing that, and the design is pretty good. 0:16:42 I would say V0 has got the best UI and design 0:16:43 out of any of them. 0:16:44 Yeah, I agree with that. 0:16:45 Yeah, I mean, it makes sense. 0:16:47 Like the founder I've actually met him, 0:16:49 I know him, Guillermo Rauch, he's a great guy. 0:16:51 And he's really design focused. 0:16:53 You know, he's always been like a huge fan of like Apple 0:16:55 and he loves Steve Jobs. 0:16:58 And so it makes sense that he's super UI and design focused. 0:16:59 So they definitely have the best design 0:17:02 out of any of the like code generators with AI. 0:17:05 But also apparently it got way better on the back end. 0:17:06 Like apparently now when it, 0:17:08 not only is it like creating the landing page for you, 0:17:09 but if you want some kind of back end, 0:17:10 I don't think it does everything yet, 0:17:12 but apparently it makes it way easier 0:17:13 to connect it to something right now. 0:17:14 And you could even, 0:17:16 it'll help you to generate a database as well. 0:17:17 And you could connect it yourself. 0:17:19 And it does things even like, 0:17:21 you know, secret keys and stuff like that. 0:17:24 If you know, if you have like a password for a database, 0:17:26 you can even store it in environment variables 0:17:28 and stuff like that, which is the kind of thing 0:17:29 where when you're coding, you might want something in there, 0:17:32 but you don't want to put the password in the code. 0:17:34 You would have like a secret that handles all that 0:17:36 for you now and helps you do it. 0:17:38 Yeah, I wonder how many times people just put an API key 0:17:41 directly into their code and then throw that software 0:17:43 on GitHub and now everybody has your API key. 0:17:46 It happens pretty often, I'm pretty sure.
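To make that last point concrete, here is a minimal sketch of the keep-secrets-out-of-the-code pattern Nathan is describing, in Python. The variable names are hypothetical examples and nothing here is specific to V0; the only idea is that the secret lives in the environment (or a .env file that never gets committed), not in the source that ends up on GitHub.

```python
# Minimal sketch: read secrets from the environment instead of hardcoding them.
# DATABASE_URL and MY_API_KEY are hypothetical names, not tied to any tool.
import os

DATABASE_URL = os.environ.get("DATABASE_URL")  # injected by your shell, a .env loader, or the host
API_KEY = os.environ.get("MY_API_KEY")

def connect():
    if not DATABASE_URL:
        raise RuntimeError("Set DATABASE_URL in the environment, not in the code.")
    # The secret is only used at runtime; the repository stays clean.
    print(f"Connecting with a {len(DATABASE_URL)}-character connection string...")

if __name__ == "__main__":
    connect()
```

Locally the values usually come from a .env file listed in .gitignore; in production the hosting platform injects them, which is exactly the kind of plumbing the newer generators now scaffold for you.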
0:17:48 Yeah, I mean, so if they keep improving this fast, 0:17:50 I mean, it seems like it’s totally possible 0:17:52 that within like three months, 0:17:55 you’re going to be able to, like basic SaaS apps, 0:17:58 probably copy most of their features, 0:18:01 including the back end and everything and the UI. 0:18:04 And so that’s a, it’s a new world to be in. 0:18:05 I mean, you’ll definitely see, you know, 0:18:08 there used to be those guys that forgot the guy’s name, 0:18:09 the two brothers in Russia. 0:18:11 They used to be the two guys who were notorious, 0:18:12 like Silicon Valley hated them 0:18:14 ’cause they literally, like they just watched, 0:18:17 like they’d read TechCrunch and like any cool startup 0:18:19 that was like seemed to be getting traction, 0:18:20 they just copy them. 0:18:22 Oh, I remember hearing about that story a while ago. 0:18:25 I forgot their names, but they were like notorious. 0:18:27 And so you’ll probably say this, like even, 0:18:28 like at a larger scale, right? 0:18:30 ‘Cause you’ll, you’ll be able to copy anything 0:18:31 and put it in another language or whatever. 0:18:32 Yeah, yeah. 0:18:34 I mean, I don’t really see anything wrong 0:18:37 with like using another website for inspiration, right? 0:18:38 Like use it as a starting point, 0:18:40 but then like completely overhaul it, right? 0:18:44 Like, okay, I really like the way that, you know, 0:18:47 TechCrunch’s blog looks, make it look like that. 0:18:49 Okay, now I’m going to go change the colors. 0:18:51 Now I’m going to go change the font a little bit. 0:18:54 I’m going to go, and next thing you know, it, you know, 0:18:57 it has similarities, but it’s not the same website anymore. 0:18:59 Like I feel like that’s what it was designed for, 0:19:02 but I don’t think that’s what is going to be used for. 0:19:04 – Yeah, but I mean, the reality is though, 0:19:06 like web designers have been doing that for a long time, 0:19:08 right, there’s like, you pay them a ton of money 0:19:10 and then like you actually hang out with designers 0:19:12 and see what they do and a lot of it is like, 0:19:14 they got a bunch of samples of things they like, 0:19:16 they put it into a Figma and then they copy little bits 0:19:18 of pieces and edit some stuff 0:19:19 and then that’s kind of what they do, right? 0:19:21 – I’ve hired a whole bunch of designers myself 0:19:22 over the years, right? 0:19:23 And they always send you a questionnaire 0:19:26 and one of the questions on the questionnaire is like, 0:19:29 give me two or three websites that you really, really like 0:19:30 so that we could like, you know, 0:19:32 use that as a starting point, right? 0:19:33 So they’re always asking you like, 0:19:36 what sites do you want your site to be inspired by? 0:19:37 – So, I mean– 0:19:39 – Yeah, so on the positive side, like if you’re a company 0:19:42 or an individual looking to create like a new project 0:19:44 to test it out, especially look, you know, 0:19:47 we’re both like creators with decent followings. 0:19:48 I think like creators in a great spot 0:19:51 because if you want to test out a concept 0:19:53 and then share it and just see what happens, 0:19:54 it’s gonna be easier than ever. 0:19:56 It’s gonna be easier than even like Framer 0:19:56 and stuff like that. 0:19:59 ‘Cause at some point you’ll be able to do like AI voice too, 0:20:00 right? 0:20:01 And just like, just chatting with you like, 0:20:02 oh, I want to make something like this. 
0:20:04 And like, oh cool, it’s done five minutes later, 0:20:06 tweet it out, see if people are interested. 0:20:07 They’re not, who cares? 0:20:09 You know, there’s like people forget a week later anyway. 0:20:12 So it’s gonna really change 0:20:14 how people create new products, I think. 0:20:15 – Yeah, for sure. 0:20:17 I don’t know if you’ve heard about this one yet or not, 0:20:22 but like Uber is getting into the game of AI labeling, 0:20:24 which, you know, is what scale AI does, right? 0:20:29 So basically they’re trying to start like 0:20:33 side hustle kind of thing where anybody can go 0:20:36 and look at a whole bunch of AI-generated images 0:20:40 and then label them or go read a whole bunch of dialogues 0:20:44 from chat transcripts and, you know, rank them 0:20:46 of like how good it did. 0:20:49 And you can actually get paid by Uber 0:20:51 to actually go and do that. 0:20:54 So Uber is getting into the like data labeling business 0:20:59 as a side hustle gig economy thing, 0:21:01 which I find really, really fascinating. 0:21:03 – Yeah, yeah, well, I mean, I think Uber 0:21:05 is in a very odd position. 0:21:08 They haven’t been really innovative in a long time. 0:21:10 You know, they fired Travis, the founder, 0:21:12 who very controversial figure. 0:21:14 I think I told you I met him before 0:21:17 and had a pretty negative interaction with him, 0:21:19 but he did do an amazing job at leading the company. 0:21:21 And ever since he’d been gone, 0:21:22 like they haven’t really done that many 0:21:24 new innovative things. 0:21:27 And now with, you know, the rise of like Tesla 0:21:30 and, you know, you’re gonna have like AI taxis, 0:21:32 you know, driverless taxis are coming. 0:21:33 Like they’re definitely coming, 0:21:37 especially now with like the new Elon Trump buddy party, 0:21:39 you know, whatever’s going on there, 0:21:42 like they’re definitely coming now. 0:21:43 And so since that’s happening now, 0:21:47 if I was Uber, I would be kind of terrified, honestly, 0:21:51 because they’re obviously behind in AI compared to Tesla, 0:21:53 like probably like dramatically behind. 0:21:56 And so this may be an effort to kind of do two things at once. 0:21:57 There are a few things. 0:21:59 I mean, one is you’re like kind of hedging your bed. 0:22:01 So if one of your business, you know, 0:22:04 units fall off when you’ve got this new business unit. 0:22:06 Another thing is you’re building up technical talent 0:22:08 for AI, which they’re gonna need for driverless. 0:22:11 And then at the same time, backup plan, 0:22:13 maybe Tesla or someone buys you. 0:22:16 Like he made yourself a more attractive acquisition target 0:22:18 because you have a better AI unit 0:22:20 and something that could be useful for them. 0:22:22 So that’s why I think it’s going on. 0:22:24 – Well, I also think, you know, on the flip side of it, 0:22:28 the sort of like side hustle, make money side of it, 0:22:31 it kind of pushes a little towards like a theory 0:22:33 I’ve talked about in the past of like, 0:22:36 I don’t know if the government is gonna offer 0:22:38 like a UBI kind of thing in the future. 0:22:39 I think what we might see, 0:22:42 at least in like a nearer term future 0:22:44 is like a lot of these companies 0:22:47 are going to figure out ways to do this like gig work 0:22:50 where people are making money doing stuff like this. 
0:22:53 So if you're like out of your accountancy job 0:22:56 because AI does accountant work now, 0:22:58 well now maybe you were making money by, 0:22:59 you know, driving people around 0:23:01 until the cars just do it for you 0:23:05 or labeling AI data for some of these companies, right? 0:23:08 I feel like that might be where that's like 0:23:11 the sort of next like economic shift 0:23:13 is a lot more like people taking on a whole bunch 0:23:16 of like gig work to make their money 0:23:18 versus having like a traditional career path. 0:23:21 And I feel like this is sort of like a step 0:23:23 in that direction in my opinion. 0:23:24 – Yeah, that makes sense. 0:23:26 You know, it is interesting that 0:23:28 Elon Musk has been a proponent of UBI, 0:23:30 but now he's really, you know, 0:23:31 more on the Republican side, 0:23:34 which very much sees UBI as like a socialist concept 0:23:36 that you have to avoid. 0:23:38 I think the reality is gonna be somewhere in the middle, 0:23:40 probably like what people actually have to do 0:23:41 where like you probably, 0:23:43 eventually with AI when things become more abundant, 0:23:45 it probably will make sense for there to be something 0:23:48 where people's basic living expenses are covered in some way. 0:23:51 But then maybe you do also have a gig economy on top of that 0:23:52 where there's like actual extra incentives. 0:23:54 Like, okay, if you don't wanna just like sit around 0:23:56 and play video games, 0:23:57 well like here's extra incentive to go off 0:24:00 and actually do something productive, right? 0:24:01 – Well, I mean, even saying all that though, 0:24:04 like the sort of conservative, 0:24:05 like the Republican party, 0:24:08 almost all of the top people in the Republican party 0:24:10 were at one time Democrats, right? 0:24:13 So it's just like a sort of like shifting 0:24:14 of that sort of line. 0:24:17 – Yeah, yeah, yeah, it's just a label 0:24:19 and what the label means has changed. 0:24:21 – Yeah, for sure. 0:24:23 You know, and speaking of Elon Musk, 0:24:25 this is another news story that just came out 0:24:28 is that the rumor has it, 0:24:32 he wants to create an app to go head to head with ChatGPT. 0:24:35 Now, obviously we've got Grok inside of X 0:24:38 and you can open up the X app on your phone 0:24:41 and use Grok directly inside of your phone. 0:24:45 But it sounds like he wants to make a standalone xAI app 0:24:47 that is like basically ChatGPT, 0:24:52 but using the xAI technology and supposedly unbiased 0:24:55 and all the kind of stuff that he stands for. 0:24:57 – Yeah, I mean, we talked about this a little bit off camera, 0:25:00 but like, my take on it is he wanted X 0:25:02 to be the everything app, 0:25:04 which he clearly stated was 0:25:07 because he saw WeChat in China 0:25:08 and he realized that WeChat in China 0:25:10 is an app where you do everything. 0:25:11 There's games in there, 0:25:13 there's shopping, there's social media, 0:25:16 there's your banking, it's all in one. 0:25:19 I think one thing that he probably didn't really realize 0:25:22 is like the way users interact with apps in China 0:25:24 and America and Europe and everything, 0:25:26 it's all very different. 0:25:28 And like in Japan, websites here, 0:25:30 like there's some websites that are so, 0:25:31 there's so much data on the website. 0:25:33 Like in America, people would be like, 0:25:35 what is this horribly ugly website? 0:25:36 This is horrible. 0:25:38 What is all this nonsense on there?
0:25:39 There's too much stuff. 0:25:41 I don't know what to focus on. 0:25:43 For whatever reason, people in Japan love that. 0:25:47 They love it to be information dense. 0:25:49 Joi Ito from MIT, he's talked a lot about this. 0:25:51 So like Japanese people, for whatever reason, 0:25:53 they love UIs where things are information dense, 0:25:55 tons of information. 0:25:57 And I think that's a kind of problem you run into 0:26:00 is like most Americans for whatever reason, 0:26:03 with X, they're gonna think of it doing one thing. 0:26:04 You tweet on there. 0:26:05 I think he's gonna have a hard time convincing people 0:26:07 that it's a YouTube too. 0:26:08 – Yeah. 0:26:08 – Right? 0:26:10 'Cause like, no, they've got one thing in mind. 0:26:12 – Well, I mean, he's trying to convince people 0:26:15 that it's YouTube, it's the next PayPal, 0:26:18 the next payment platform, it's the next ChatGPT, 0:26:21 it's the, you know, the list goes on and on and on, 0:26:22 like literally the everything app, 0:26:25 like he wants you to just live on X eventually. 0:26:26 – Yeah, yeah. 0:26:28 He's like I said, he's modeling off of, you know, 0:26:31 China's WeChat and people are different, 0:26:32 have different things that they want. 0:26:33 I don't think it'll work in America. 0:26:36 – Yeah, yeah, I think that's a tough sell. 0:26:37 – Yeah, as a standalone, maybe it could work. 0:26:39 Like if he makes it cool, 0:26:42 like I could see it working 'cause like we said in the past, 0:26:45 Grok is really good at doing like funny images and stuff, 0:26:48 stuff that like the other ones won't let you do. 0:26:50 So like if he does a good standalone app, 0:26:52 that now is all of a sudden the best one 0:26:56 at generating, you know, funny AI art, right? 0:26:57 Like memes or whatever. 0:26:59 I can see that being super successful. 0:27:00 – Yeah, yeah. 0:27:03 And I mean, right now Grok, it uses Flux.1 Pro, 0:27:05 which is probably the best one out there 0:27:08 at generating funny memes and, you know, realistic images. 0:27:10 So. – Yep, yep. 0:27:11 – So lots of interesting stuff going on 0:27:13 in the world of AI right now. 0:27:14 If you like stuff like this, 0:27:17 make sure you're subscribed to this channel on YouTube 0:27:19 or Spotify or Apple Podcasts. 0:27:20 Wherever you tune into podcasts, 0:27:21 make sure you're subscribed 0:27:24 and we'll keep on putting out podcast episodes 0:27:25 like this for you. 0:27:26 Thank you so much for tuning in. 0:27:28 We'll see you in the next one. 0:27:30 (upbeat music)
Episode 36: What happens when AI technology leaks, and who benefits the most? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) discuss OpenAI’s recent leak involving their video generation model Sora.
In this episode, Matt and Nathan dive into the implications of the Sora leak, how it affected OpenAI, and what it means for the future of AI video generation. They also explore the resulting industry reactions, the complexity of maintaining NDAs, and how AI companies like Anthropic are releasing innovative features to stay competitive.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Artists frustrated over restricted video sharing rights.
(04:12) Creators lost access to flawed Sora tool.
(06:31) Anthropic’s Model Context Protocol connects AI data.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
Testing 4 New AI Coding Tools: Bolt vs V0 vs Replit vs Websim.AI
AI transcript
0:00:04 A year from now, these apps are going to be like absolutely mind blowing. 0:00:08 Like you probably will be able to prompt a whole SaaS company into existence, right? 0:00:12 Yeah, and the kind of stuff that you'd have to be a developer before to do any of this, right? 0:00:12 Yeah, yeah. 0:00:16 So if you're in business, I would be thinking about what that means in terms of the opportunities, 0:00:19 like what are you going to be able to do with this new technology once it gets that good? 0:00:26 The way I've been using these tools is to create just like really basic, simple mini apps that sort of clear bottlenecks in my business. 0:00:31 Hey, welcome to the Next Wave podcast. 0:00:32 I'm Matt Wolf. 0:00:42 I'm here with Nathan Lanz, and we're entering into a world right now where the gap between having an idea in your brain and actually seeing it as a working app, 0:00:48 a working piece of software on your computer screen, that gap is closing quickly. 0:00:56 And we're entering a really, really exciting time right now where anybody can write code to do pretty much anything they need it to do. 0:01:02 And there's a whole new crop of up and coming AI coding tools that we've been hearing so much about. 0:01:09 And we wanted to put them all through their motions, tools like Bolt, tools like V0, WebSim, Replit AI. 0:01:16 These are some of those AI coding tools that we're starting to see everywhere and people are just raving about them. 0:01:23 So we wanted to find out what they're good at, what they're not so good at, and really put them through the motions and see what we can build with them. 0:01:29 So let's go ahead and just jump right in and play with some of these up and coming AI coding tools. 0:01:35 When all your marketing team does is put out fires, they burn out. 0:01:40 But with HubSpot, they can achieve their best results without the stress. 0:01:49 Tap into HubSpot's collection of AI tools, Breeze, to pinpoint leads, capture attention, and access all your data in one place. 0:01:52 Keep your marketers cool and your campaign results hotter than ever. 0:01:55 Visit HubSpot.com/marketers to learn more. 0:02:11 We recently did an episode with Riley Brown where we walked through building out an entire app step by step with Riley using mostly Cursor for that project. 0:02:18 But there's been like this new crop of AI tools popping up like Bolt.new and V0 from Vercel. 0:02:20 And what was the other one? Replit? 0:02:21 Replit AI agent, yeah. 0:02:27 Right, yeah, so that we've got all of these new AI tools that are popping up that are just making it so much easier to code. 0:02:31 And I mean like even easier than what we saw with Cursor, right? 0:02:39 Like with Cursor, even when Riley Brown showed it to us, we kind of started with a template and then we had like a starting point. 0:02:43 And then we used Cursor to build on top of that template that Riley provided us, right? 0:02:47 Well, with these other tools like Bolt and V0, you don't even need that template, right? 0:02:51 You just go in there and you say, I want an app that does X. 0:02:54 And it just builds it and it gives you a working app. 0:02:59 So that's what we want to do on this episode is play around with some of those tools, put them through their motions a little bit. 0:03:05 Talk about the implications and see what we can get them to build for us on this episode. 0:03:08 Yeah, I guess let's just jump into Bolt. 0:03:10 Yeah, so Bolt is like really simple, right?
0:03:13 You can just, you know, give it a concept and it’ll build it out. 0:03:20 I like to ask it to build a game first because I always feel like games are a lot more complex than what like most businesses are going to use it for. 0:03:27 And it’s sort of like a really good demonstration of something more complicated than what you’re probably even going to use it for. 0:03:29 Last time I did it, I had it build Tetris. 0:03:33 We see everybody on earth use the snake example. 0:03:34 So I don’t want to do Tetris or a snake. 0:03:37 I was thinking maybe like space invaders. 0:03:39 I’m going to have to build space invaders real quick. 0:03:45 So I can just give it the prompt, build a space invaders game. 0:03:47 And then just like, that’s it. 0:03:48 That’s the whole prompt. 0:03:50 One, two, three, four, five words. 0:03:52 And then you can see on the screen it’s just coding. 0:03:54 I’m not, I’m not touching anything. 0:03:57 It’s going in right and all the code you can see over on the left. 0:04:03 It’s already, you know, got all the pages and everything’s already built and ready to go. 0:04:05 It’s just, it’s, it’s just creating it. 0:04:06 I’m not doing anything. 0:04:11 And let’s see, let’s see if just one prompt is enough to make this work. 0:04:16 We can see here over on the left, it says I’ll create a modern beautiful space invaders game with react and type script. 0:04:21 It will feature smooth animations, particle effects and responsive controls. 0:04:24 So here it says I’ve created a modern space invaders game with the following features, 0:04:30 smooth canvas, particle effects, score tracking, modern UI, responsive controls with keyboard input, 0:04:35 beautiful start and game over screen, performance optimized with proper frame timing. 0:04:40 So you can see over here on the right of my screen, we’ve got space invaders. 0:04:41 Random side start. 0:04:45 But this is reminding me of back in the day when I launched my startup gamify.com. 0:04:50 We had this, this waiting list and when you would join the wait list after you, 0:04:54 after you gave your email, you would be prompted to fill out this puzzle, 0:04:56 which was like using our logo and you had to put the puzzle pieces in correctly. 0:04:58 And it timed how fast you could do it. 0:05:01 And then we had a, had a leaderboard and everyone was like sharing this. 0:05:03 It was like this, like crazy viral things. 0:05:06 They kind of cool to like replicate something like that where you have like a little, 0:05:09 little game and use that as a marketing tool of some kind. 0:05:10 Yeah, it’s so cool. 0:05:13 That gives you all the, all the code and like does all the folder structure and everything for you. 0:05:18 Like, I remember some of the early AI website generator apps or whatever. 0:05:19 A lot of them didn’t give you code. 0:05:22 Like you just see like the final product and it’s like, okay, now what? 0:05:24 Yeah, yeah, yeah. 0:05:30 What I really wanted to talk about on this episode was the concept of like just creating 0:05:32 little mini apps for your business. 0:05:38 I think a lot of people see these tools like cursor and bolt and V zero and replet and all 0:05:40 of these kinds of things. 0:05:42 And they think, well, I’m not trying to build a software company. 0:05:44 I’m not going to go and launch a SAS. 0:05:47 That’s, that’s just, you know, I’ve already got my business. 0:05:49 I don’t want to go do that. 0:05:49 Right. 
0:05:54 But the way I've been using these tools is to create just like really basic simple mini 0:05:57 apps that sort of clear bottlenecks in my business. 0:05:57 Right. 0:06:02 Like one thing I was running into is a lot of times I'm pulling images off the internet 0:06:06 and they get pulled as, as WebP files. 0:06:06 Right. 0:06:10 Like if you generate an image in DALL·E inside of OpenAI and you download it to your computer, 0:06:13 it downloads it as a WebP file. 0:06:14 I really hate WebP files. 0:06:16 They're kind of annoying to work with. 0:06:18 I want it to be converted to a JPEG file. 0:06:23 I can't use WebP files in DaVinci Resolve when I'm editing my videos to overlay them 0:06:23 on the videos. 0:06:25 I need JPEG files. 0:06:25 Right. 0:06:30 So I created a simple little app with a single prompt where I can drag and drop as many 0:06:33 WebP files into that, into that little app. 0:06:36 And it would just convert them all into JPEGs like instantly. 0:06:41 It just created this little Python script called webp_converter.py. 0:06:42 I can open it. 0:06:46 It just says WebP to JPEG converter. 0:06:50 Here's where the files will be saved: code/webp_converter/saved. 0:06:52 That's the folder right there. 0:06:56 I made it so you can like change the folder, pick wherever you want them to automatically 0:06:57 be saved. 0:07:01 And then I've got, let's see, you can see I got these three WebP files that I just dragged over here. 0:07:05 One's a picture of a tongue that I downloaded for some reason. 0:07:07 Here's a picture that you're telling. 0:07:08 Okay, good. 0:07:09 I think this is like a DALL·E. 0:07:11 Yeah, this is a DALL·E image. 0:07:15 And this one's also a DALL·E image, but they're all WebP files. 0:07:18 And so if I grab all three of these, just drag them and drop them into this little app that 0:07:19 I built here. 0:07:22 And it says conversion complete: successful three, failed zero. 0:07:27 And then if I look at my saved folder here, it just saved all three of those as JPEGs here. 0:07:28 And like really, really simple app. 0:07:32 I know there's other apps on the internet that already can do that kind of stuff. 0:07:36 But I wanted to be able to just do it in bulk really quickly without, you know, 0:07:39 ads all over the screen and all that kind of crap that you get when you go to them. 0:07:42 It's like online file converters. 0:07:47 And so I just, I had Bolt build me this little simple app really, 0:07:50 really quickly, downloaded the files on my computer. 0:07:55 And now like if I'm working in DaVinci and I'm trying to find, you know, images to overlay 0:07:59 into my videos, I just have this little thing open on my screen whenever I download an image. 0:08:00 I throw it in there. 0:08:01 It gives me the JPEG back. 0:08:06 It just saves me a few extra seconds on converting those files. 0:08:12 Because the way I used to do it was I would open it up in like the Windows image viewer app. 0:08:15 And then I would re-save it as a JPEG file. 0:08:18 And this is just like way quicker. 0:08:24 So that's the way I've been seeing these, these apps is just like little ways to sort of solve 0:08:25 bottlenecks in my business. 0:08:29 Like what's something that, I just, I want to solve this little problem. 0:08:32 I don't really need to give this app away or sell it. 0:08:33 It's just for me. 0:08:35 And I found that really, really helpful. 0:08:41 So, you know, I think anybody can look at their business and go where are the bottlenecks?
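For anyone who wants to build the same kind of bottleneck-clearing utility by hand, here is a minimal sketch of a WebP-to-JPEG batch converter in Python using the Pillow library. It is not the exact script Bolt generated for Matt (his had a drag-and-drop window); this is a plain command-line equivalent, and the folder names are just example values.

```python
# Minimal sketch of a WebP -> JPEG batch converter, similar in spirit to the
# webp_converter.py app described above. Requires Pillow: pip install pillow
import sys
from pathlib import Path
from PIL import Image

def convert_folder(input_dir: str, output_dir: str = "saved") -> None:
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    ok = failed = 0
    for webp_file in sorted(Path(input_dir).glob("*.webp")):
        try:
            # JPEG has no alpha channel, so convert to RGB before saving.
            Image.open(webp_file).convert("RGB").save(
                out / (webp_file.stem + ".jpg"), "JPEG", quality=95
            )
            ok += 1
        except OSError:
            failed += 1
    print(f"Conversion complete: successful {ok}, failed {failed}")

if __name__ == "__main__":
    # Usage: python webp_to_jpeg.py <folder_of_webp_files> [output_folder]
    convert_folder(sys.argv[1] if len(sys.argv) > 1 else ".",
                   sys.argv[2] if len(sys.argv) > 2 else "saved")
```

Wrapping the same logic in a small drag-and-drop window (or asking one of these tools to do that part) is the only step between this sketch and the little app described in the episode.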
0:08:44 Where are the things that slow me down just a little bit? 0:08:49 Can I create a little quick script that’s going to make that thing just slightly easier? 0:08:51 Speed up the process just a little bit. 0:08:56 And that’s why I’m loving about these tools is just like finding those little things that 0:08:58 just make my life a little bit easier. 0:09:02 Yeah, and the kind of stuff that you’d have to be a developer before to do any of this. 0:09:02 Right. 0:09:06 We’re, you know, now if you’re in it, you could be a CEO and have an idea of something 0:09:08 that makes your business a little bit better a tool, a simple tool. 0:09:11 And maybe before you’d have to hire an entire team of engineers to build it. 0:09:14 And now if it’s something very simple in there, you could just use something like Bolt. 0:09:18 Let’s try something else inside of Bolt though, because it is actually a lot more impressive 0:09:19 than this here. 0:09:24 I’ll even let me show you real quick some of the other stuff that I already built first. 0:09:26 Because I thought this was really impressive. 0:09:32 So I built this Tetris game here and it’s just, you know, it’s Tetris like you’d expect it. 0:09:35 And it looks really, really good. 0:09:36 Right. 0:09:40 So, you know, I’m just obviously like just dropping everything really, really quick. 0:09:42 But it works. 0:09:46 Trust me, like the lines actually all the scoring works. 0:09:47 So you can see that that worked. 0:09:49 So that was one of the apps that I created. 0:09:52 This other one was just like a little fun one that I was playing around with, 0:09:55 which is in an ASCI generator. 0:09:56 Right. 0:10:03 So what you do is you drop any image in and then it recreates that image as like words and letters. 0:10:06 So just kind of like some of the random stuff that you can do with it. 0:10:09 But yeah, do you have an idea of like another app we can test real quick? 0:10:13 I think it’d be kind of cool to see what it does when you give it to a, 0:10:17 like a screenshot of a SAS app or something. 0:10:20 Like can you, can it copy an app like from a screenshot? 0:10:20 We can try. 0:10:22 Do you have a specific website in mind? 0:10:25 How about linear? 0:10:26 Linear, yeah. 0:10:29 Like linear, is that linear.com? 0:10:29 Dot app. 0:10:33 So you want me to just take a screenshot of like what we see above the fold here? 0:10:34 Yeah, thanks. 0:10:34 So let’s just see. 0:10:36 What does it do? 0:10:38 Does that have any idea what to do? 0:10:42 Is it like totally off or is it like close and the graphics are just wrong? 0:10:44 So I just took a screenshot told it to recreate this page. 0:10:46 So let’s see what it does here. 0:10:48 I’m not even familiar with linear. 0:10:51 So I don’t know what we’re, what it even is. 0:10:52 All right. 0:10:56 So now we’ve got a screen that has got some text on it. 0:10:57 Let’s see. 0:11:01 What should I, what prompt should I give it to try again? 0:11:02 Let’s see. 0:11:04 There are no images. 0:11:12 This looks nothing like the original image I gave you. 0:11:16 Please make it better. 0:11:18 Let’s just see what it does. 0:11:19 Oh, you’re fired. 0:11:20 Add that and see if it works. 0:11:21 Or you’re fired. 0:11:21 Yeah. 0:11:22 I’ll try that next time. 0:11:24 We’ll start threatening it with the next prompt. 0:11:28 No, just went back to just to wait. 0:11:29 Oh, wait, no. 0:11:30 Okay. 0:11:33 Now we’ve got some really crappy looking gradients in the background. 0:11:38 We’ll be right back. 
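As an aside, for anyone curious how the screenshot-to-code workflow just attempted above can be wired up by hand: the sketch below sends a local screenshot to a multimodal model and asks for a single HTML file back. This is only a guess at how tools like Bolt or V0 might approach it internally, not their actual implementation; it uses the OpenAI Python SDK, and the model name, prompt, and file path are all example values.

```python
# Rough sketch: send a screenshot to a multimodal model and ask for HTML back.
# Assumes `pip install openai` and OPENAI_API_KEY set in the environment.
import base64
from openai import OpenAI

client = OpenAI()

def screenshot_to_html(image_path: str) -> str:
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",  # example model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Recreate this page as a single self-contained HTML file "
                         "with inline CSS. Return only the code."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(screenshot_to_html("landing_page.png"))  # example file path
```

The hard part is everything these tools layer on top: iterating on the result, wiring in real components, and keeping the output consistent across prompts, which is exactly where the experiment above fell short.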
0:11:41 But first I want to tell you about another great podcast you're going to want to listen to. 0:11:45 It's called Science of Scaling, hosted by Mark Roberge. 0:11:48 And it's brought to you by the HubSpot Podcast Network. 0:11:51 The Audio Destination for Business Professionals. 0:11:56 Each week, host Mark Roberge, founding chief revenue officer at HubSpot, 0:12:00 senior lecturer at Harvard Business School, and co-founder of Stage 2 Capital, 0:12:05 sits down with the most successful sales leaders in tech to learn the secrets, strategies, 0:12:08 and tactics to scaling your company's growth. 0:12:13 He recently did a great episode called How Do You Solve for Siloed Marketing and Sales? 0:12:15 And I personally learned a lot from it. 0:12:17 You're going to want to check out the podcast. 0:12:20 Listen to Science of Scaling wherever you get your podcasts. 0:12:29 So far, Bolt seems to be pretty good at making games and like novel stuff. 0:12:33 So I just told it like create me a simple directory website for lore.com, 0:12:36 make it like an educational site about the lore of different fantasy worlds. 0:12:42 So this is like, and then make it beautiful, just a simple kind of website, kind of works. 0:12:45 And obviously, if anybody is watching this or listening to this, 0:12:52 and it seems like this worked really fast, we are skipping over a lot of the actual writing 0:12:56 of the code in the video just so that it speeds up the flow of the video here. 0:13:01 So yeah, I think Bolt is not amazing when you try to upload an image 0:13:03 and try to get it to like duplicate that image. 0:13:04 That's what I'm finding out. 0:13:09 But I think if we were to just like explain what website we want, 0:13:11 Bolt would do a pretty decent job. 0:13:14 So now, Nathan, you're sharing your screen. 0:13:16 You didn't give it an image this time, right? 0:13:18 You just kind of told it what you wanted. 0:13:20 Yeah, I was just thinking like, you know, 0:13:22 let's imagine I was using lore.com for something different. 0:13:27 Like I was using it for what a lot of people imagine would be like a directory about fantasy 0:13:30 or video games or like the lore behind different fantasy worlds and stuff. 0:13:33 So I said, create me a simple directory website for lore.com, 0:13:37 make an educational site about the lore of different fantasy worlds, etc. 0:13:40 Use Astro, which is a framework I like to use for static websites. 0:13:42 And make it beautiful. 0:13:47 So this is what Bolt gave me, which I don't think is very beautiful. 0:13:52 I mean, it's a lot more beautiful than what I got when trying to start with an image. 0:13:57 And here's what V0 from Vercel gave me, which I think is simple, kind of minimalistic, 0:13:59 but I think it does look better. 0:14:01 What I think the big difference is, is like, 0:14:03 Bolt didn't pull in any icons. 0:14:05 It didn't pull in any images. 0:14:07 It didn't like do any of that stuff. 0:14:12 At least when you used V0, it actually added some icons to try to make it look good. 0:14:13 Yeah, yeah. 0:14:15 And it also, it's not showing it here. 0:14:16 I'm trying to get it to reload. 0:14:17 It's not doing it again. 0:14:20 But when it first loaded, all of this was like animated too. 0:14:21 And it was really fancy looking. 0:14:22 So nice. 0:14:23 I thought that was cool. 0:14:27 Like it like had some basic animations, like all these like the words slowly came in, 0:14:29 you know, and like it like flashed in all that kind of stuff. 0:14:31 So yeah, yeah.
0:14:34 V0 definitely has like a lot better design to it. 0:14:40 It's just, it's so fascinating to me because when I was using Bolt the first time, 0:14:44 like everything I was doing was working on like either the first or second prompt. 0:14:47 Sometimes I had to give it a second prompt and then it started working really well. 0:14:48 But it was always within two prompts. 0:14:50 I got what I was looking for. 0:14:53 I also tried this on Replit. 0:14:56 And so their approach is like really different because I think it's interesting. 0:15:02 It's almost like you're chatting with like an AI project manager or something, right? 0:15:06 Or instead of like, you know, Bolt and V0 just go ahead and kind of build it for you 0:15:08 and don't really ask for additional input. 0:15:12 Like this is asking me like, it's wanting to clarify what I want. 0:15:12 Right? 0:15:13 Right. 0:15:14 They, you know, they ask me other things. 0:15:16 Would you like any of these additional features? 0:15:17 We can also make changes later. 0:15:20 So, do you want to add interactive timeline features? 0:15:20 Sure. 0:15:23 Do you want to implement character relationship maps? 0:15:25 If they can do that, I don't know. 0:15:25 That'd be awesome. 0:15:27 Let's see if they can actually do that. 0:15:29 Add a user contribution system. 0:15:30 I'm just going, I'm going to say yes to all of this. 0:15:32 Can you do me a favor real quick though? 0:15:35 And just like read out the prompt that you gave it just for anybody who's listening on audio. 0:15:38 I want to make sure that they kind of have a little more context. 0:15:40 And also it was very small on my screen so I can't read it. 0:15:41 Yeah, yeah. 0:15:43 Yeah, the same prompt I used for the other one. 0:15:45 Create me a simple directory website for lore.com, 0:15:49 make an educational site about the lore of different fantasy worlds, etc. 0:15:51 I told it a web framework I wanted to use. 0:15:53 I said use Astro and make it beautiful. 0:15:55 So pretty straightforward. 0:15:57 And yeah, and then it gives me all these options. 0:15:59 So let's, let's approve the plan and start. 0:16:05 So it seems like out of the three that V0 gives you the most beautiful stuff out of the box. 0:16:08 But I do really like how it works on Replit in terms of 0:16:11 just the whole interface of it telling you everything. 0:16:14 Like maybe for some people this is probably overwhelming 0:16:16 because you actually see the, you see the code flying in here. 0:16:17 It shows you as it's doing it. 0:16:19 Yeah, I mean, Bolt did that too though. 0:16:24 Bolt, you were able to like basically watch it create every single file and then write the code. 0:16:25 Right. 0:16:27 This looks prettier. 0:16:28 Yeah, this looks prettier. 0:16:32 And I do like the whole interface of like it's like a project manager 0:16:35 where it's like asking you the different features you want, right? 0:16:36 Yeah, yeah. 0:16:37 Yeah. And I've tested all of them before. 0:16:41 Like I didn't test Bolt, but I tested V0 and Replit. 0:16:42 Like Replit's way more thorough. 0:16:45 Like it'll give you a whole freaking like login system and stuff. 0:16:46 Yeah. 0:16:49 Yeah. I mean, this is doing the kind of stuff that, you know, 0:16:53 that Riley was showing us with Cursor in terms of like creating a database and everything. 0:16:54 Like this is doing all that. 0:16:55 It just created a Postgres database. 0:16:57 So it's like, it's incredible. 0:16:59 So it did a lot of stuff. 0:17:01 It like created a database and everything.
0:17:04 Well, it did create a lot of files for you. 0:17:10 You know, maybe since I was sort of giving Bolt two tries, two prompts, 0:17:13 maybe the next best thing is give it like one more prompt, 0:17:18 say like, hey, here's what's going on and see if it'll just fix it for you. 0:17:20 Yeah. So the AI project manager said, 0:17:25 is the website displaying properly with the fantasy theme and navigation working? 0:17:28 I mean, it says the words fantasy. 0:17:30 Is that enough to consider the fantasy theme? 0:17:36 The website loads, but there isn't any navigation. 0:17:40 You can't really do anything with it. 0:17:43 I don't know if that's enough, but I guess let's give it like two or three 0:17:45 tries and see where it gets us. 0:17:48 But this is the one I used with my son, 0:17:50 like where we created like a little simple website, 0:17:55 like giving like tips, tips for Minecraft and League of Legends. 0:18:00 And it, you know, after like five tries got us something that was kind of okay. 0:18:03 Yeah. The other one I was trying to think of was Windsurf Editor. 0:18:04 I don't know if you've heard of that one yet. 0:18:08 That one to me looks very similar to Cursor. 0:18:12 It looks very, because it's also, I believe it's also a fork of VS Code just like Cursor was. 0:18:16 So, so Replit's, I think the agent's totally confused now. 0:18:20 It's like, is the website displaying the three fantasy worlds, 0:18:26 Middle Earth, Westeros, and Hogwarts universe with their descriptions and featured articles? 0:18:32 We're still, so they've got some good database but like definitely nothing is being pulled in. 0:18:41 So, I think from all of the various apps that I've tested so far between Bolt and V0, 0:18:48 and the Replit one, I actually think probably the best one at actually making like a fairly 0:18:50 decent looking user interface has been WebSim. 0:18:55 And I don't think most people actually see WebSim as like a code writing tool, but it totally is. 0:18:58 So, if I use that same prompt that you created, I can come up here. 0:18:59 What would you like to create today? 0:19:01 Let's go ahead and paste in the same one. 0:19:04 Create me a simple directory website for lore.com. 0:19:07 Make it an educational site about lore of different fantasy worlds, etc. 0:19:11 I'm going to get rid of use Astro because I don't know what language this, 0:19:14 I think it kind of uses its own thing. 0:19:17 But let's go ahead and just prompt that and see what WebSim creates for us. 0:19:22 I don't totally know how WebSim works behind the scenes, 0:19:26 but it is writing a whole bunch of code for us behind the scenes. 0:19:30 We can see it's still loading, but you can see as I move my mouse around, 0:19:33 it created this like little star field in the background, 0:19:39 lore, explore the vast realms of fantasy, Middle Earth, Westeros, the Witcher's world, 0:19:40 Hogwarts. 0:19:44 So, it created all of these like descriptions. 0:19:48 One thing that you can do with WebSim too is like not give it any context 0:19:52 and just tell it go like lore.com, right? 0:19:54 If I type this in, it's not going to go to your lore.com. 0:19:57 It's just going to create a whole new website called lore.com 0:20:01 and just sort of decide what it wants to be all about. 0:20:04 Maybe they're getting hung up on this one because like they think they need to create 0:20:06 all this content and that just like gets complicated for it. 0:20:10 Like figuring out all the different images or text or whatever.
0:20:13 So, this is what it thinks lore.com should look like. 0:20:16 Journey through the most captivating fantasy worlds ever created, 0:20:18 discover ancient mysteries, legendary creatures, 0:20:21 and epic tales that shape these magical realms. 0:20:24 And then if I click on begin your adventure, let's see where it takes us. 0:20:27 That's kind of fun, like having it imagine what it thinks it would be. 0:20:29 I'm trying to get another one still, like create lore.com. 0:20:30 Like what is that? 0:20:32 It actually kind of created the same content, 0:20:35 Middle Earth, the Witcher's continent and Westeros. 0:20:39 Well, you can see it created like a fairly simple design, 0:20:42 but if I come up here to these three dots over here on WebSim, 0:20:45 you can actually see there's a view source button and a download button, right? 0:20:48 I can click on view source and it'll show me all the code for the website 0:20:51 that I can just copy and paste to wherever. 0:20:55 Or I can click download and it'll just, I believe, download it as a, 0:20:58 yeah, just download it as an HTML file. 0:21:00 So, this will actually write the code for you 0:21:02 and then let you have the code after it writes it. 0:21:04 I'm going to give it another prompt. 0:21:06 Make the links clickable. 0:21:08 You have this. 0:21:10 Okay, so now if I go to lore worlds, 0:21:14 you can see it brought up Middle Earth, Westeros, the continent 0:21:17 and Wizarding World and then it's got like tags underneath. 0:21:20 Let's see if I go into Middle Earth, it gives me an error. 0:21:24 So, it's, I mean, it's, it gave us a UI. 0:21:27 It's not giving us like a full working website, 0:21:29 but I mean, it's one of the better looking UIs 0:21:31 that we've gotten out of some of these platforms. 0:21:35 But let's go ahead and jump over to what you've been getting. 0:21:37 So, you said you've been working with Vercel. 0:21:42 Yeah, so I tried just what you did, like create lore.com. 0:21:44 I just wanted to see what it would come up with 0:21:46 if you just literally just put in a domain and that's it. 0:21:52 So, here when I told Vercel or V0 just to create lore.com, 0:21:54 it actually like went and like looked up my website 0:21:57 and like basically copied the style. 0:22:00 I mean, this is not exactly, but this is the same colors, 0:22:02 the same font, the same menu. 0:22:06 Even the same text for the button, you know. 0:22:11 They even got my logo and made something that's somewhat similar, 0:22:14 which is kind of wild. 0:22:17 So, they're definitely, you know, you could definitely 0:22:20 look at the style of a website you liked and say, 0:22:23 "Hey, let's like, you know, create something like that." 0:22:24 And you could do that. 0:22:26 So, that's exciting and a little bit scary 0:22:28 that anyone could just probably, you know, 0:22:29 pretty soon just copy your website like that. 0:22:33 So, Bolt, when I said create lore.com, 0:22:35 it didn't look at the existing website. 0:22:37 It definitely like just went off and decided to be creative 0:22:40 and like, you know, decide what it thought that would be. 0:22:43 It's like, it's created a course website here, I guess. 0:22:46 It says I have over a thousand active courses 0:22:48 and over 50,000 students, which is great. 0:22:50 Yeah, congratulations. 0:22:52 Yeah, maybe that's what I should be doing. 0:22:57 And it's got all these like featured courses down here, 0:22:59 right, which has actually got like images and stuff.
0:23:03 So, it's pretty, it looks decent, but none of it works. 0:23:06 I mean, so I'm not sure, it is still a little bit hard 0:23:08 to figure out what the actual use case 0:23:09 would be for it right now. 0:23:12 Because if I'm a developer, maybe I would just, 0:23:14 I wouldn't do this, you know, 0:23:16 I would just use my own templates and stuff 0:23:19 and get started that way or use Cursor, probably. 0:23:20 If you were somebody who's non-technical, 0:23:23 yeah, you could use this, but if it doesn't actually work. 0:23:26 No, I mean, the thing you showed, 0:23:28 like that simple little Python app, that's cool. 0:23:29 Like if you can actually make stuff like that 0:23:31 for you that actually works. 0:23:33 It seems like they all sort of still struggle 0:23:35 with creating really decent UIs. 0:23:38 Here's what I think each of these things are good for, right? 0:23:40 I actually think Bolt is really, really good 0:23:42 for creating simple apps. 0:23:44 Like you can see on my screen here, 0:23:46 this was a prompt, create a simple app 0:23:49 that converts WebP files to JPEGs, right? 0:23:52 One prompt, didn't have to prompt it again, 0:23:54 and it created it, right? 0:23:55 WebP to JPEG converter, 0:23:57 you drag and drop your files right into here, 0:24:00 it'll convert the WebP files into JPEGs, right? 0:24:02 If you need a really, really simple app for your business 0:24:04 to just sort of solve a bottleneck 0:24:05 that you're running into, 0:24:10 you just have like a quick idea of something 0:24:13 that a pretty simple script can solve for you, 0:24:15 Bolt seems really, really good for that. 0:24:18 Replit AI also seems like it'd probably 0:24:20 be pretty good for that. 0:24:23 We've only tried Replit AI on this episode 0:24:26 to try to do like a website design. 0:24:27 We didn't really try it for a basic app, 0:24:29 but I bet you Replit would probably be 0:24:31 just as good if not a little bit better 0:24:33 than Bolt at creating those apps, 0:24:36 just because it does seem to get your feedback 0:24:37 as you go, right? 0:24:39 It was, you'd probably say make me an app 0:24:43 that converts a WebP file to a JPEG, 0:24:44 and it would probably say, 0:24:48 do you want it to be able to select where it saves to? 0:24:50 Do you want it to be colorful? 0:24:52 Do you want, like it'd probably give you some like 0:24:55 extra prompts that you could answer the questions 0:24:57 so that it gets more detail. 0:24:59 So something like Bolt and the Replit AI 0:25:01 would probably be really, really good for that. 0:25:05 When it comes to like creating user interfaces, 0:25:09 I feel like that's sort of what V0 was more designed for. 0:25:12 I feel like when I first heard about V0, 0:25:14 like almost a year ago, 0:25:16 that's what they were really sort of promoting it as. 0:25:19 They were promoting it as like a really good tool 0:25:22 to create some like solid user interfaces. 0:25:24 And of all the tools that we used, 0:25:26 that was the one that was clean. 0:25:28 It was really simple. 0:25:29 You said it had some like animated graphics 0:25:31 when you first loaded the page. 0:25:34 Like that was the one that seemed to sort of nail 0:25:36 a UI right out of the box. 0:25:37 The very first time you tried, 0:25:40 the UI looked pretty decent right out of the box. 0:25:44 And then the creators of V0, Vercel, 0:25:46 they're also the creators of Next.js, 0:25:49 which has like become the most popular framework 0:25:51 for creating apps, web apps.
0:25:54 So I think where V0 probably would be the best 0:25:57 is if you are a developer and you’re creating a new website, 0:25:58 this is probably a great way 0:26:01 to like jump-start the website’s development, right? 0:26:03 Versus like people before might just have a template, 0:26:04 but sometimes a template doesn’t fit 0:26:05 what you’re trying to do now. 0:26:07 So instead of doing that, 0:26:09 maybe this replaces having your own template. 0:26:12 You go in there, describe what you’re going to build, 0:26:14 have it build the basic frameworks, 0:26:17 the basic layout and make it look decent. 0:26:18 And then you go in there and actually add the content 0:26:20 and make it do what you want it to do. 0:26:22 Replit AI agent, even though it probably had 0:26:25 the worst design, but I do see like possibly 0:26:27 the most potential in how it’s doing things 0:26:29 in terms of you’ve got this AI chatbot 0:26:30 that you keep going back and forth with. 0:26:32 So instead of having to go code it yourself 0:26:34 with like Cursor or whatever, 0:26:36 like you really can just kind of go back and forth, 0:26:37 like here’s the project, here’s, 0:26:40 and then I love that it asks you what features you want. 0:26:41 Yeah, yeah. 0:26:42 It didn’t work perfectly, 0:26:45 but I did do this with my son and it did actually work 0:26:48 where you could like log in and create a database 0:26:50 and you had like a login system and everything. 0:26:52 I think eventually for like more serious projects, 0:26:55 I think the Replit way is the interesting way 0:26:57 to approach this where you’ve got a project manager 0:26:59 that you chat with and make sure you’re getting 0:27:00 what you want and go back and forth 0:27:02 and then you eventually land on, 0:27:05 making sure you’ve got all the features you need. 0:27:07 And then WebSim, I’ll just say one thing about WebSim. 0:27:10 Where WebSim I feel like is really good is if, like, 0:27:13 maybe you have a domain name that you don’t know 0:27:16 what to do with, go plug the domain name in 0:27:18 and let it figure out what the site’s going to be for you 0:27:21 because that’s where WebSim I think like really shines 0:27:26 is just like being that sort of creative brain for you, right? 0:27:28 I type in lore.com. 0:27:31 It has no idea what lore.com is, 0:27:33 but it went and created a site about Middle Earth 0:27:36 and Westeros and the continent and the wizarding world 0:27:40 and apparently I can search realms and creators and eras 0:27:43 and so it gave me an idea of what lore.com could be. 0:27:45 So if I have a domain, I’m like, I don’t know what to do with this. 0:27:49 This is a nice little idea generator 0:27:52 that’ll give me a little mock-up, a little concept that I can use. 0:27:54 All of these let you download the code. 0:27:57 All of them, you know, you can go and put it on GitHub. 0:27:59 You can download it to your computer. 0:28:01 You can, you know, go and do whatever you want 0:28:02 with the code once it’s created, 0:28:06 but they all seem to have like their little pros and cons. 0:28:08 This is already so much better than a year ago. 0:28:10 So I do believe that when you combine that 0:28:13 with the fact that the underlying models are getting better, 0:28:15 that, you know, with the reasoning models 0:28:16 and things like that coming out from OpenAI, 0:28:18 probably in a year, 0:28:20 these are going to be very, very good. 0:28:21 Like it’s going to surprise people.
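The login-plus-database project Nathan mentions building with Replit's agent above is, under the hood, a couple of routes over a small database. As a point of reference only, a minimal Python/Flask sketch of that shape (assuming Flask is installed, with SQLite and hashed passwords; this is an illustration, not what Replit actually generated) could look like:

    # Minimal sketch of a login-plus-database app of the kind an AI agent scaffolds.
    # Assumes Flask is installed; illustrative only, not what Replit actually generated.
    import sqlite3
    from flask import Flask, request, session
    from werkzeug.security import generate_password_hash, check_password_hash

    app = Flask(__name__)
    app.secret_key = "change-me"  # required for session cookies

    def db() -> sqlite3.Connection:
        conn = sqlite3.connect("users.db")
        conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT PRIMARY KEY, pw TEXT)")
        return conn

    @app.route("/signup", methods=["POST"])
    def signup():
        # Store a hashed password, never the plain text.
        conn = db()
        conn.execute("INSERT INTO users VALUES (?, ?)",
                     (request.form["name"], generate_password_hash(request.form["password"])))
        conn.commit()
        return "Account created", 201

    @app.route("/login", methods=["POST"])
    def login():
        row = db().execute("SELECT pw FROM users WHERE name = ?",
                           (request.form["name"],)).fetchone()
        if row and check_password_hash(row[0], request.form["password"]):
            session["user"] = request.form["name"]
            return "Logged in"
        return "Invalid credentials", 401

The point is less the specific framework than the shape: a chat-driven agent like Replit's asks what features you want, then fills in exactly this kind of boilerplate for you.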
0:28:23 And so if you’re in business, 0:28:24 I would be thinking about what that means 0:28:25 in terms of the opportunities. 0:28:27 Like what are you going to be able to do 0:28:29 with this new technology once it gets that good? 0:28:31 And also in terms of, you know, defending your business, 0:28:32 like what does it mean 0:28:34 when people can copy your website or whatever? 0:28:37 Yeah, I mean, you know, from a business perspective, 0:28:42 I think probably any of these tools are going to be fairly decent 0:28:44 at creating like little mini web apps 0:28:46 that you can get people to opt in for 0:28:48 if you’re just kind of giving away access for free. 0:28:54 So, you know, that’s where I feel like the state of AI is. 0:28:56 I think if I’m being honest right now, 0:29:00 I’m probably still going to use Cursor more than anything else. 0:29:04 I still really like the sort of VS Code fork. 0:29:05 I like that look. 0:29:07 It’s familiar to me. 0:29:09 I kind of understand it. 0:29:12 Like I’m probably still going to use Cursor most of the time. 0:29:16 But for people that, you know, they don’t want to see all that code. 0:29:18 They don’t want, like they just want to type a prompt 0:29:20 and have the code written for them. 0:29:22 These are some alternatives that are out there 0:29:23 that you can go play with. 0:29:25 They’re getting better and better and better, 0:29:27 like practically by the day. 0:29:30 You know, by the time this episode is even live, 0:29:33 they’re probably better than what we just showed off in this episode. 0:29:35 So it is still pretty impressive. 0:29:39 And I think, you know, a year from now, 0:29:42 these apps are going to be like absolutely mind blowing. 0:29:45 Like you probably will be able to prompt a whole SaaS company 0:29:46 into existence, right? 0:29:50 Like I think that’s probably going to happen by like next year. 0:29:52 So really, really exciting advancements. 0:29:54 Really fun to play with. 0:29:56 I mean, like you can lose yourself for hours 0:29:59 just playing with these and going that didn’t work, fix it, 0:30:01 and then eventually kind of get it. 0:30:02 And there’s something fun about that. 0:30:06 Like I don’t know why, but like creating a script 0:30:08 and getting it to generate something for me 0:30:10 and then going it didn’t work. 0:30:11 Maybe we try this. 0:30:14 That worked, but also now I want to add this feature 0:30:17 and just sort of having these conversations back and forth 0:30:20 and seeing like a concept that you had in your mind flesh out 0:30:23 into something that’s actually usable. 0:30:24 Like it’s an addicting thing. 0:30:25 It is really, really, really fun. 0:30:28 Well, that’s always been the most exciting thing about coding 0:30:30 that most non-coders don’t realize. 0:30:32 Like that is the magic of like you got something in your head. 0:30:35 Now you go code it and then you make it real. 0:30:38 And for a lot of people who were not super obsessed with coding, 0:30:40 the whole part in the middle is not actually 0:30:41 the interesting part. 0:30:42 Yeah. 0:30:44 The part that’s interesting is like, idea, it’s real. 0:30:47 That’s the part that’s super addicting. 0:30:49 And so many people have never experienced that. 0:30:52 So I think that’s a big unlock for people is like 0:30:55 even people who have no technical skills soon will be able 0:30:58 to like go through that kind of creative process of like, 0:31:00 and you’ll be able to do the designs and stuff, right?
0:31:02 Like the design part will get baked into this too. 0:31:04 We’re like, okay, let’s flesh out like the color scheme. 0:31:06 Like you can imagine like one way like, 0:31:07 let’s figure out your color scheme first. 0:31:10 Like let’s, okay, you like these kind of pictures? 0:31:11 Okay, cool. Here’s your color scheme. 0:31:13 Here’s like your five colors, main color, secondary, 0:31:14 all this kind of stuff. 0:31:14 Accent. 0:31:16 Yeah. 0:31:17 That’s where this is all going to go, 0:31:17 where you’re going to be like, 0:31:19 it’s going to be an entire creative process 0:31:20 and you’ll be able to go through the entire thing 0:31:23 with the help of AI and make things how you want to. 0:31:24 And that’s so exciting. 0:31:26 So really, really exciting stuff. 0:31:28 We’re going to be experimenting with more tools. 0:31:31 Maybe a year from now we’ll do another episode 0:31:32 where we play with these exact same tools 0:31:34 and literally see how far they’ve come 0:31:38 because they will probably be unrecognizable by then. 0:31:40 So we’ll have to revisit that in a little bit. 0:31:42 But for this episode, 0:31:44 I think we’ll go ahead and close this one out. 0:31:46 Thank you so much for tuning in. 0:31:49 Make sure you subscribe to us wherever you listen to podcasts. 0:31:51 This one was a very visual episode. 0:31:52 So make sure you check it out on YouTube 0:31:53 if you’re listening to the audio 0:31:56 because we were showing a lot on our screen. 0:31:58 But also make sure you subscribe to us 0:31:59 wherever you listen to audio. 0:32:01 We’re on all of the audio platforms. 0:32:03 Thank you so much for tuning in. 0:32:04 We’ll see you in the next episode. 0:32:06 (upbeat music) 0:32:09 (upbeat music) 0:32:12 (upbeat music) 0:32:14 (upbeat music) 0:32:17 (upbeat music) 0:32:20 (upbeat music)
Episode 35: How effective are the latest AI tools for coding and app development? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) take us through a head-to-head evaluation of Bolt, v0, Replit, and Websim.AI, analyzing their design, usability, and overall effectiveness.
In this episode, the hosts dive deep into the capabilities of each AI coding tool, sharing personal experiences and testing outcomes. From creating a Space Invaders game with Bolt to building a directory website for lore.com with Replit, they explore how these AI tools can transform the app development landscape. They also discuss the future of AI in enabling non-technical users to engage in creative design processes and the potential for AI to automate entire SaaS companies within the near future.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Using AI tools to clear business bottlenecks.
(04:53) Using mini apps to streamline business processes.
(06:39) Dall-E images converted to JPEG effortlessly.
(11:24) Bolt struggles with image duplication, excels verbally.
(17:15) Websim excels in creating user interfaces.
(20:20) AI can replicate website design accurately, surprisingly.
(23:02) Replit AI excels in web design and apps.
(25:46) Websim creatively designs websites for domain ideas.
(29:07) Turning ideas into reality through coding.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
0:00:02 I think a lot of people have this misconception right now. 0:00:09 For anybody to think that we’ve hit the end of the road with AI is delusional. 0:00:14 I don’t know the right word, but we’re closer to the one yard line than we are to the 99 yard line. 0:00:17 We’re at the beginning of an exponential curve. 0:00:20 We’re not plateauing. We’re literally right here. 0:00:27 We’ve seen the most insane advancements in technology the world has ever seen in human history. 0:00:31 And we’re going to get to witness AGI and most likely ASI in our lifetimes. 0:00:33 Like that to me is mind blowing. 0:00:39 Hey, welcome to the Next Wave podcast. I’m Matt Wolf. 0:00:42 I’m here with Nathan Lanz. 0:00:45 And today we’re talking about some really important topics. 0:00:47 The world has recently changed. 0:00:50 We just had an election here in the US and Donald Trump was elected. 0:00:56 And we’re going to spend a big chunk of this episode talking about what that actually means for the world of AI. 0:01:04 We’re also going to talk about how there’s been a major shift in the way Silicon Valley’s optimism has been towards tech and AI. 0:01:10 And we’re going to give you some of the predictions that we have of where AI is going in 2025. 0:01:11 You’re not going to want to miss this one. 0:01:12 So let’s just jump straight in. 0:01:19 When all your marketing team does is put out fires, they burn out fast. 0:01:25 Sifting through leads, creating content for infinite channels, endlessly searching for disparate performance KPIs. 0:01:26 It all takes a toll. 0:01:30 But with HubSpot, you can stop team burnout in its tracks. 0:01:34 Plus, your team can achieve their best results without breaking a sweat. 0:01:40 With HubSpot’s collection of AI tools, Breeze, you can pinpoint the best leads possible. 0:01:46 Capture prospects’ attention with clickworthy content and access all your company’s data in one place. 0:01:48 No sifting through tabs necessary. 0:01:50 It’s all waiting for your team in HubSpot. 0:01:54 Keep your marketers cool and make your campaign results hotter than ever. 0:01:57 Visit hubspot.com/marketers to learn more. 0:02:04 Let’s just go ahead and get into it. 0:02:08 I know, Nathan, you’ve had a lot on your mind about this stuff. 0:02:10 Where do you think’s a good starting point for this? 0:02:13 Yeah, I mean, maybe give a little bit of a disclaimer, like, you know, 0:02:15 this is not going to become a political podcast. 0:02:20 And I understand this whole topic is quite controversial because some people love Trump. 0:02:21 Some people hate him. 0:02:25 I absolutely hate him, but I think it’s going to be kind of hard to avoid this topic because moving forward, 0:02:29 obviously, Trump and Elon Musk are going to be pivotal in a lot of the changes 0:02:31 that are going to happen with AI and technology. 0:02:35 So it’s going to be impossible not to discuss the stuff that they’re doing. 0:02:39 And something I’ve noticed is, you know, I kind of confided to you recently. 0:02:44 You know, I’m kind of an ex-liberal, lived in San Francisco for 13 years, 0:02:48 fell out of love with a lot of the left-wing policies because of the stuff I saw there. 0:02:52 And I have noticed that, like, even with all of my left-wing friends in Silicon Valley right now, 0:02:54 there’s a major vibe shift happening. 0:02:57 Like, like even my left-wing friends are like admitting like, oh, yeah, 0:02:59 the mood has dramatically changed.
0:03:03 It’s went from being very pessimistic and scared of the government to thinking like, 0:03:05 holy crap, it was always a talk in Silicon Valley. 0:03:07 Like, why do we not have any smart people in the government? 0:03:08 Why is that? 0:03:12 Like, why don’t we actually get Silicon Valley like some of the smartest people involved in the government? 0:03:14 It’s like, well, that is now happening here, right? 0:03:20 Or are we sending our dumbest people to the government or is it the smartest people don’t want to work for the government? 0:03:21 Yeah, exactly. 0:03:23 But it’s like, now there’s this moment where it’s like, okay, like Elon Musk. 0:03:26 It’s like, well, even people don’t like his politics. 0:03:28 It’s hard to argue that he’s not intelligent. 0:03:31 Like, you can’t build those kind of companies that he has without being incredibly smart. 0:03:35 And then you’ve got Vivek, who’s also a highly intelligent guy. 0:03:38 And you got JD Vance, who used to be a venture capitalist, right? 0:03:42 So you’ve got people who actually understand technology highly involved in the government now. 0:03:46 And the big thing they just announced is, you know, Doge, which I guess you didn’t know. 0:03:48 Like, a lot of people don’t realize like it’s not just the meme coin. 0:03:53 It’s like, it’s a Department of Government Efficiency is what this stands for. 0:03:54 And it actually started out as a joke on X. 0:03:59 Like somebody tweeted like, oh, you should create the Department of Government Efficiency and call it Doge. 0:04:01 And it literally started from a joke. 0:04:03 And now it’s a real thing. 0:04:04 And this is actually why a lot of people are excited. 0:04:06 Like, you know, because it’s obvious. 0:04:10 Like, you know, he’s interacted with the government, like realizes, you know, it’s almost like when you go to the hospital, 0:04:14 where like they charge you $1,000 for like a straw and stuff like this, right? 0:04:20 Like the government is full of this kind of stuff where people are just like throwing away money because it’s not theirs. 0:04:21 It’s a taxpayer’s money. 0:04:24 And I think a lot of that on both sides is happening. 0:04:29 So people are excited because the idea is they’re going to look through, you know, how the government’s spending money. 0:04:35 And I predicted on X yesterday that Elon Musk would probably be important for this because people on both sides, 0:04:40 including Republicans, are going to hate this because, you know, you don’t want for people to know how much money is wasted. 0:04:44 And it’s going to be important for Elon Musk to be involved because he can make that transparent. 0:04:46 And then it’s going to be hard for him to hide from that, right? 0:04:51 Like if you make a transparent X show where the money is being spent, what can they do about that? 0:04:54 And so Elon Musk announced yesterday like, yeah, that is what he’s doing. 0:04:59 He’s going to make a public leaderboard where he’s going to show where the money is being spent. 0:05:01 And then they’re just going to cut it out. 0:05:06 And then Vivek, he shared something earlier because people are saying like, oh, they’re not going to have any power to do any of this. 0:05:11 It’s all just talk. And Vivek’s like shared all these like Supreme Court cases that seem to say otherwise, 0:05:17 seem to say that there’s like going to be a legal precedent that like the government agencies have already overstepped their bounds. 
0:05:21 There’s Supreme Court rulings that seem like you could cut a lot of the agencies down. 0:05:23 And so I think that’s what he’s going to try to do. 0:05:27 He’s going to try to do like, you know, with like Twitter where, you know, he cut Twitter down by 80%. 0:05:30 Sure, there were some problems along the way. Of course, there’s problems when you change things like that. 0:05:36 But overall, the company is doing way better now and it has like 80% less people. 0:05:39 And so he’s convinced with the government, you’re going to be able to do the same kind of thing. 0:05:43 The current plan for my understanding is that he’s going to give people a two-year severance pay. 0:05:50 So they’re talking about possibly reducing the size of the government by like anywhere from 30% to 80% 0:05:51 and giving everyone two-year severance pay. 0:05:56 And then, hey, you should go, you know, working some new job in AI or tech or whatever, find a new job. 0:06:02 Right. And so instead of spending all that money, if you could actually invest that money into the U.S. 0:06:05 and making sure that the U.S. is number one, that could change everything. 0:06:08 I think that people are not realizing that like all that money is being wasted. 0:06:13 If you reinvested it in new things in America, that you could really change the country, you know? 0:06:16 And I shared a tweet on X or post or whatever they’re calling. 0:06:19 Maybe like a month ago, they got like 47 million views. 0:06:23 Talk about how I went from Japan to America and solve the problems in America, right? 0:06:27 Like, yeah, America is the best country in some ways, but there’s a lot of things that don’t work well. 0:06:32 Like in compared to Japan, like you go to America and things just don’t work, right? 0:06:35 And it feels like a lot of that is because we waste money on all these stupid things 0:06:38 where we could be investing that money into infrastructure and new technology. 0:06:43 And so that’s why I’m personally excited because I think this could possibly be like a golden age for America 0:06:46 where like we actually start investing in the country again. 0:06:49 And they also proposed building 10 new cities in America. 0:06:54 Since Elon Musk is going to be involved in that, 10 new cities that are going to be obviously infused with like AI. 0:06:58 You’re going to have like robot cars, you’re going to have, you know, truck pop flying cars. 0:07:01 I don’t know if I actually like that idea of flying cars, honestly. 0:07:06 But you’re going to have like 10 new cities that are going to be built from the ground up with AI and technology in mind. 0:07:10 So just imagine instead of wasting two trillion dollars on stuff that probably doesn’t matter. 0:07:14 It’s like people who are just doing paperwork all day that actually typically slow down companies. 0:07:20 Instead of that, you put that two trillion a year into building 10 new AI powered cities. 0:07:22 Like America would look dramatically different. 0:07:24 And so I think people who understand that, that’s why there’s a vibe shift. 0:07:28 And I’m happy to see that even like moderate left wing people are like, I don’t like Trump, 0:07:31 but the idea of Dove is amazing, excited for it. 0:07:33 So yeah, no, that’s, that’s really interesting. 0:07:37 I think you dropped like five or six different things there that all be like sort of talking voice 0:07:39 that we can go down the rabbit hole on. 
0:07:43 But yeah, I mean, I keep on seeing this Doge thing on X and every time I saw it, 0:07:48 like I almost sort of skimped past it because I thought it was talking about the crypto thing. 0:07:51 And, you know, I’m like somewhat interested in crypto. 0:07:53 I hold some Bitcoin and stuff. 0:07:57 This, you know, which by the way is another whole story there with Bitcoin exploding. 0:08:02 But you know, I hold a little bit of crypto, but I don’t do the Doge thing. 0:08:04 Never really been into the mean coin thing. 0:08:07 So every time I see Doge, I just sort of scroll past it thinking, oh, 0:08:09 this is just another thread of somebody talking about crypto. 0:08:12 So it’s, you know, not something that’s really on my radar. 0:08:18 But now that I know that it’s actually referring to the department of government efficiency. 0:08:18 Yeah. Yeah. 0:08:22 Now that I know that I’m going to actually start paying a little bit more attention to it. 0:08:23 Yeah. I think it’s kind of, yeah. 0:08:27 So maybe that’s distracting that the name it, but also I think it probably got more attention to it. 0:08:30 Right. Because like, obviously people in the media are like super upset. 0:08:31 Like this is ridiculous. 0:08:35 He’s like calling it Doge and like there’s like an official Doge job. 0:08:40 You know, we’ve already got a Doge coin, which is apparently unrelated completely. 0:08:41 Maybe they’ll tie together. 0:08:42 Who knows? 0:08:43 Yeah. Who knows? 0:08:45 I mean, you know, Doge has always been a meme, right? 0:08:49 Like I hung out with Jackson Palmer when he moved from Australia to San Francisco back 0:08:52 of the day when he first started Doge coin as a joke. 0:08:53 You know, really fun guy. 0:08:57 But yeah, the whole thing was a joke and now it’s become a, you know, a whole thing. 0:08:57 Yeah. Yeah. 0:09:00 I really like the sort of leaderboard idea. 0:09:01 I can’t visualize it. 0:09:06 I have no idea what something like that would look like, but I really like that level of transparency 0:09:10 where like anybody can go and be like, wait, we’re putting this much money towards this thing. 0:09:13 Like why this leaderboard is out of skew? 0:09:15 We need to adjust this a bit. 0:09:16 I love the idea. 0:09:21 Like I tweeted, I think like a year ago, like one of my big predictions was that in the next year or two, 0:09:24 we would see AI start to have an impact on government spending. 0:09:29 You know, this might be controversial, but you know, my belief is that Democrats and Republicans both, 0:09:32 there’s a lot of corruption in my opinion. 0:09:35 I think there’s a lot of people overpaying their friends, companies, and then later on, 0:09:37 they get favors and things like this. 0:09:42 And I think that there’s so much complexity that it’s hard for the average person to understand that or see it. 0:09:47 And I’m pretty convinced that once you start applying AI to looking at all the government spending data, 0:09:50 there’s going to be some things that kind of pop out like, oh, 0:09:55 why are we paying a million dollars for this, you know, for, you know, whatever? 0:09:59 Like for that kind of shovel or whatever thing it is, you know, like I think there’s like so many examples like this of like, 0:10:01 just dramatically overpaying for things. 0:10:03 And so I’m excited. 
0:10:07 I think in probably 12 years, we’ll be looking at AI or maybe in less than 12 years, 0:10:10 like AI will be like almost like part of the government where it’s like showing us like, 0:10:11 here’s how this money’s been hit. 0:10:12 We could spend it more efficiently. 0:10:18 It’s all about how you can spend more efficiently, more intelligently versus what I think is happening right now. 0:10:20 Like I love people just overpaying their friends and things like that. 0:10:25 One thing I do want to dive into a little bit is so you’ve already sort of broken out a whole bunch of potential 0:10:29 implications of like the new administration coming in, right? 0:10:34 There are a few other things I know that like Trump basically said that on day one, 0:10:36 whether this actually happens on day one or not, 0:10:41 I feel like politicians saying I’m going to do this on day one is sort of like a talking point. 0:10:46 It’s sort of like high schoolers running for president saying I’m going to make all the vending machines free. 0:10:48 Like whenever I hear day one, that’s kind of how I feel about it. 0:10:50 Is there just saying stuff people want to hear? 0:10:56 But he did say that on day one, he wants to repeal Biden’s AI executive order, 0:11:03 which essentially Biden’s executive order like created a new form of government to sort of look at AI. 0:11:10 And also there was something in there that said that pretty much any foundation model had to be seen 0:11:14 by the government and approved by the government before it can sort of get released into the wild. 0:11:18 Those were like kind of the two main things of the executive order. 0:11:24 Yeah, government approval first and also like a government body to sort of keep track of AI. 0:11:26 And Trump said, I’m going to repeal that right away. 0:11:30 We want companies to be able to move as fast as possible when it comes to this stuff. 0:11:35 They shouldn’t need to like run it by their parents first, right? 0:11:37 That’s one of the big implications. 0:11:41 Also, we got JD Vance, which you mentioned he was a venture capitalist. 0:11:45 But one of the things that he’s been fairly outspoken about is open source. 0:11:52 He actually has made a lot of statements in the past about how he’s worried that regulation for AI 0:11:59 within the government is going to sort of strongly favor the existing incumbents 0:12:02 and make it really, really difficult for new players to get in, right? 0:12:06 Because what ends up happening is you get these big massive companies, 0:12:11 the Googles, the Metas, the Microsoft companies like this that have more money than God 0:12:17 and they can lobby the politicians to get the regulations to sort of go in their favor. 0:12:21 And a lot of times those regulations go in these big incumbents favors, 0:12:25 but the open source, the smaller companies that are just trying to get going, 0:12:29 they severely hinder those companies progress. 0:12:33 And that’s something that JD Vance, the vice president elect, 0:12:37 essentially said he’s worried about with AI regulation. 0:12:40 We need to make sure that whatever sort of things we do, 0:12:45 whatever sort of moves the government makes, they help, you know, both sides, right? 0:12:50 It’s not just completely favoring the massive incumbent companies. 0:12:54 So those are a few of like the implications that we’ve heard already. 0:12:58 Another thing is that Trump basically said he wants to make US first in AI, right? 
0:13:03 You know, he sees it as a competition with China and I’m sure there’s some other countries in the mix, 0:13:07 but for the most part, when you talk about AI, you’re usually talking about the US and China 0:13:10 are the two that seem to be like racing each other the most. 0:13:15 Yeah, I think that the concerns JD Vance has shared like actually are really similar to mine. 0:13:21 Like in terms of in the future, AI is going to be the main intelligence on planet Earth. 0:13:27 So it’s very dangerous for that to be one company owning that because then that’s one company that owns intelligence. 0:13:31 One company that owns all sources of like what is the truth, right? 0:13:33 Obviously, that’s dangerous for one company to own. 0:13:35 So a lot of his concerns are around that. 0:13:38 And so that’s why he’s a big proponent of open source, which is exactly the same as me. 0:13:41 Like, I think we can’t have just one company owning intelligence. 0:13:44 That’s like, obviously, a very bad idea for humanity. 0:13:49 But at the same time, he does seem to be very practical of like you mentioned being concerned about China 0:13:52 and realizing we are in a new arms race with China, right? 0:13:55 This is like building the nuclear bomb or building the internet or whatever. 0:14:00 Like in new technologies, America has stayed ahead because we were the number one in those areas with AI and robotics. 0:14:02 Now we have to be number one. 0:14:05 And it looks like right now we’re ahead in AI and China is ahead in robotics, right? 0:14:06 Right. 0:14:09 That’s concerning. 0:14:13 It’s good that we’re ahead in AI though, because that should help us go ahead in robotics in the future. 0:14:14 But currently, that’s not how it’s playing out. 0:14:17 Currently, China seems to be ahead in robotics. 0:14:18 And so I’m pretty sure that they’ll be practical. 0:14:21 They’re not going to be like, hey, everything has to be open source. 0:14:24 I think they’re going to be very supportive of being open AI and all these other different companies. 0:14:27 I don’t think it’s going to be all about X AI or whatever. 0:14:29 I don’t think they’re going to just favor Elon Musk. 0:14:33 And so I think I have a practical approach where it’s like, okay, there’ll be some regulation around AI, 0:14:38 but definitely not highly restrictive because I do not want to slow down American companies in terms of competing with China. 0:14:39 Yeah, yeah. 0:14:40 That’s the impression I get. 0:14:44 We’ll be right back. 0:14:47 But first, I want to tell you about another great podcast you’re going to want to listen to. 0:14:51 It’s called Science of Scaling, hosted by Mark Roberge. 0:14:57 And it’s brought to you by the HubSpot Podcast Network, the audio destination for business professionals. 0:15:04 Each week, host Mark Roberge, founding chief revenue officer at HubSpot, senior lecturer at Harvard Business School, 0:15:09 and co-founder of Stage 2 Capital, sits down with the most successful sales leaders in tech 0:15:14 to learn the secrets, strategies, and tactics to scaling your company’s growth. 0:15:19 He recently did a great episode called How Do You Solve for a Siloed Marketing in Sales? 0:15:21 And I personally learned a lot from it. 0:15:23 You’re going to want to check out the podcast. 0:15:26 Listen to Science of Scaling wherever you get your podcasts. 0:15:31 I tend to sort of avoid politics. 0:15:35 I don’t really identify as a Republican or a Democrat like never in my life. 
0:15:37 Have I identified with like either party? 0:15:40 I’ve always sort of identified with I’m an entrepreneur. 0:15:47 I take care of myself, no government entity, no one person getting elected is going to dramatically change my life. 0:15:50 It’s up to me to change my life and get to where I want to get. 0:15:54 And so I’ve always kind of had like that sort of perspective on politics. 0:16:02 But saying all of that, the comment that I was about to make is that it does seem like as far as like AI progress goes. 0:16:05 I’m not going to speak to the character of the candidates or anything like that, 0:16:12 but as far as like which candidate is going to help us get ahead in AI faster and get further with it. 0:16:17 I think the outcome of that election is going to get us further in AI, right? 0:16:19 That’s kind of how I feel about that. 0:16:25 There’s, you know, other things that I do like other things that I don’t like about, you know, both candidates that we’re running, 0:16:26 but we don’t need to get into any of that. 0:16:32 Let’s sort of like shift the topic slightly here because Gary Tan just interviewed Sam Altman. 0:16:39 And during that interview with Sam Altman, one of the questions he asked him is what are you most excited about for 2025? 0:16:42 And Sam Altman’s response was AGI. 0:16:43 I think that’ll be pretty cool, right? 0:16:46 Like, I think that was his words exactly. 0:16:52 So that’s basically Sam Altman implying that AGI is coming in 2025. 0:16:58 What’s interesting about that is there’s also been articles that recently came out from the information and from Bloomberg 0:17:05 and then pretty much all the other news outlets sort of covered what these two original news sources covered, 0:17:09 which is that they are both claiming that AI has slowed way down. 0:17:16 So to hear Sam Altman go and talk to Gary Tan on the Y Combinator podcast and say we’re gonna have AGI in 2025 0:17:23 and then seeing all these other news outlets saying we’re hitting this AI winter in progress is slowing way down. 0:17:26 Those two things seem to be at odds with each other a little bit. 0:17:30 Yeah, I mean, like from talking to friends in Silicon Valley, like everyone’s very optimistic. 0:17:37 And so, and in general, Silicon Valley does have like insider information like YC is the biggest network in Silicon Valley. 0:17:44 Sam Altman used to run YC, former YC, you know, and so they typically know about things going on in OpenAI before others do. 0:17:46 And everyone’s very optimistic. 0:17:51 So that tells me that I would trust what Sam is saying about AI over other people, quite honestly. 0:17:56 Can we talk about this before, like we were out in Boston talking about how reasoning models like 01, it’s a big deal. 0:18:02 The fact that you can now, even without more data, you can still scale AI just by throwing more compute at it. 0:18:06 And instead of being based on the data, it’s based on when it’s actually thinking about what you say to it. 0:18:10 And recently, people inside of OpenAI have started sharing comments on this. 0:18:18 Like, hey, yeah, people have not properly updated their thoughts on where AI is heading based on what 01 means. 0:18:21 Because when people try to explain 01 as well, people are like, oh, it’s just like a chain of thought. 0:18:24 And that’s all it is. It’s basically what people were doing before. 0:18:27 People at OpenAI have said like, no, that’s like kind of like what inspired it. 0:18:30 But there’s more obviously going on behind the scenes. 
0:18:32 It’s not just chain of thought. 0:18:38 And so what surprised them is it seems to reflect almost like an internal monologue the AI has now. 0:18:42 And they said some of the internal monologue that they see has been kind of shocking. 0:18:49 And I’ve been hearing rumors too, like by the time this episode is out, we might have the o1 full version, right? 0:18:54 Right now we’ve got OpenAI o1-mini and OpenAI o1-preview, right? 0:18:58 So we haven’t even seen the OpenAI o1 model yet. 0:19:02 What we’ve seen is sort of like a not fully trained version. 0:19:08 It’s almost like a checkpoint version of what the full o1 was going to be. 0:19:16 And there’s been a lot of rumors circulating over on X and various news websites that claim that in November, 0:19:18 we’re going to see the full version of o1. 0:19:22 So within the next couple of weeks, we’re likely to see the next version of o1 0:19:25 and it’s probably going to blow some people’s minds. 0:19:27 You know, we also had Dario Amodei. 0:19:29 I’m not sure if I pronounced his name right. 0:19:31 He’s the CEO of Anthropic. 0:19:33 He went on the Lex Fridman podcast. 0:19:36 Lex asked him the same question. 0:19:38 When do you think we’re going to get AGI? 0:19:44 He basically said, I believe it’s going to be in, you know, 2026 or 2027, but probably not sooner than that. 0:19:48 And then you have Sam Altman saying he believes it’s going to be 2025. 0:19:53 But I think a big thing that’s going on here, and The Information article and then Bloomberg covered it, 0:19:55 where they said things are really slowing down. 0:20:01 What we’re seeing happen is that the training side is slowing down, 0:20:06 but what they’re able to do on the inference side is getting better and better and better. 0:20:10 And I think that’s what you were just kind of saying there with the like reasoning model with o1. 0:20:14 So yes, we’re sort of running out of data to train these AI models on, right? 0:20:19 Like once you’ve scraped the whole internet and trained it into an AI model, like what are you scraping from beyond that? 0:20:25 It’s either got to be synthetic data or you’re just scraping the same thing over and over again. 0:20:30 You know, I think where people are probably getting confused too is like they saw o1-preview and they’re like, 0:20:34 oh, it’s kind of slow and it’s not better in a lot of ways. 0:20:38 But they’re not realizing like this is a new, you know, kind of overusing the word, but it’s a new paradigm. 0:20:40 But it’s a new way to build AI. 0:20:42 And so of course, the first version is not that great. 0:20:44 But it’s like the GPT-1 version, right? 0:20:46 They started the whole naming convention over. 0:20:47 That’s why, right? 0:20:53 And then people at OpenAI have like commented recently, like they’re not paying attention to how the trajectory is going to change. 0:20:55 Like the trajectory of AI, how fast things improve. 0:20:57 That’s what matters. 0:21:02 And with these new models, if you don’t have to worry about like you just trained an entirely new model and you got all this new data, 0:21:06 and there’s all these different projects, you know, it’s like it was like a nine months or a year. 0:21:09 Like they said, like for some of the models, how long it took to train everything and test it. 0:21:16 If you’re not waiting on that instead, every single day, you’re throwing more compute at this model and improving it every single day. 0:21:18 That’s probably where we’re at or heading.
0:21:22 And so that’s going to be really different than, oh, yeah, we get an upgrade every nine months. 0:21:26 We’re probably going to be like, oh, we get an upgrade every week and improvements may like speed up. 0:21:27 And so that’s what people are not thinking about. 0:21:32 Like we’re probably moving from a world of like updating every nine months, like updating every week and that’s going to change things. 0:21:36 Yeah, and eventually updating every day and then eventually updating in real time, most likely. 0:21:41 Like I think a lot of people have this misconception right now that a lot of these AI models, 0:21:46 when you’re sitting there having a conversation with it, it’s actually learning and like training on the conversation. 0:21:48 But that’s not how it works, right? 0:21:55 Like that’s why when you see a new model of GPT, it says like framed through, you know, June 2023 or, you know, 0:21:59 this model was updated on August of 2024 or whatever. 0:22:03 And like I have conversations with people sort of outside of the AI sphere all the time, right? 0:22:07 Whenever I’m, you know, hanging out with friends or family that don’t know much about AI, but they know what I do for a living, 0:22:10 they always want to ask me about it, right? 0:22:15 And the conception that most people have is that if I go and have a conversation with chat GPT, 0:22:19 it’s instantly getting smarter and smarter and smarter. 0:22:23 And if I corrected on things, the correction that I gave it is now going into the training. 0:22:24 That’s not how it works, right? 0:22:28 They’re getting updated and there’s like new training runs all the time. 0:22:33 But the conversations you’re having with it aren’t actually updating the model. 0:22:35 Saying that, I think that’s eventually where it’s going to get to. 0:22:38 I think it is going to get to a point where in real time, 0:22:43 the models are sort of getting smarter and smarter and smarter based on the conversations they’re having. 0:22:48 And no, like one person is going to be able to totally screw over the model 0:22:52 by giving it a whole bunch of fake information and then assuming it’s going to get trained into the model, 0:22:58 because I would imagine it’s going to have some sort of system where it’s looking at all of the information in aggregate. 0:23:04 And when a certain specific information is fed multiple times over and over again, 0:23:06 that’s the information that’s going to get updated and fixed. 0:23:11 But if somebody is going in there saying, here’s how many rocks you should eat on a daily basis, 0:23:13 not necessarily going to update with that information. 0:23:17 It’s sort of going to look at the aggregate of everybody communicating with it. 0:23:19 That’s where I think it’s going to get. 0:23:22 I posted something on Twitter the other day about how like, 0:23:26 it feels like there’s been less exciting things in AI lately. 0:23:28 And somebody put a comment on there saying, 0:23:34 what you’re failing to realize is that the big monumental shift that AI was going to generate 0:23:39 has already happened and there’s no more progress from here. 0:23:41 And my response to that was just false, right? 0:23:45 Because I could literally go on and on and on about all of the stuff that’s coming. 0:23:49 I mean, I’ve even got NDAs with companies that have shown me some stuff 0:23:51 that I think are going to blow people’s minds, right? 0:23:56 But I know what’s coming intuitively and even seen some of it. 
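Matt's "aggregate before you learn" idea a few lines up is speculation about how real-time learning might be gated, not a description of how ChatGPT works today, but the gating logic itself is easy to sketch. A toy Python illustration (the class name, threshold, and data shapes are all invented for this example) might look like:

    # Toy illustration of gating user corrections on aggregate agreement before
    # they would ever reach a training run. Entirely hypothetical; the threshold
    # and data shapes are invented for this sketch.
    class CorrectionGate:
        def __init__(self, min_distinct_users: int = 1000):
            self.min_distinct_users = min_distinct_users
            self.reports: dict[str, set[str]] = {}  # claim -> ids of users who reported it

        def report(self, user_id: str, claim: str) -> None:
            """Record that one user pushed back with this correction."""
            self.reports.setdefault(claim, set()).add(user_id)

        def accepted(self) -> list[str]:
            """Only corrections echoed by many distinct users make the cut."""
            return [claim for claim, users in self.reports.items()
                    if len(users) >= self.min_distinct_users]

A single user insisting you should eat rocks every day never clears the threshold; a correction thousands of users independently make would.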
0:24:03 And I’m like, for anybody to think that we’ve hit the end of the road with AI is delusional. 0:24:06 I don’t know the right word, but we’re not at the end. 0:24:10 I mean, like we’re just starting to touch what like world models can do. 0:24:13 Like, you know, modeling all of the world and environment 0:24:16 and having that information inside of the AI as well 0:24:19 so that it works better with like embodied AIs and things like that. 0:24:20 Like we’re closer to the beginning. 0:24:25 We’re closer to the like the one yard line than we are to the 99 yard line. 0:24:27 Humans have a hard time extrapolating out. 0:24:31 Like, you know, you see progress and then imagine what’s going to happen after, 0:24:33 you know, that technology built on it after two or three years, 0:24:35 how things are going to change. 0:24:37 People have a hard time like imagining those things. 0:24:41 And now I remember when GPT one and two came out, like my friends in San Francisco, 0:24:44 like a lot of them were like ex-YC people, right? 0:24:48 And in our private group chats, they were sharing results from GPT one and two. 0:24:50 And like this changes everything. 0:24:52 And I’m like, maybe they’re more intelligent than me. 0:24:54 I feel like they got it like slightly faster than I did. 0:24:57 But like once I like I got it, I was like, oh, yeah, that does like AI actually works now. 0:25:00 Like, even though it’s not great yet, this is the beginning of it. 0:25:03 And so those same people who recognize that they’re more optimistic 0:25:05 than ever before right now. 0:25:08 And so I trust those people over the people who probably at that time, 0:25:11 they were like, oh, AI is nothing or ChatGPT is not good. 0:25:12 Yeah, they didn’t understand that kind of stuff was coming. 0:25:15 Like the people in the know, they’ve known for a while. 0:25:18 You know, even at my previous startup, Binded, we did computer vision stuff. 0:25:22 And there was breakthroughs happening in AI then before LLMs even, right? 0:25:24 Especially on the computer vision side. 0:25:26 Like there was dramatic changes happening 0:25:29 where you could start recognizing the elements of images and stuff. 0:25:33 And so like AI has been improving for a long time now. 0:25:36 And before that, you know, machine learning applied to like recommendation engines 0:25:37 for Amazon and YouTube. 0:25:41 Like AI has been a long path and a lot of people don’t realize that 0:25:43 that this hasn’t happened overnight. 0:25:44 Yeah, right. 0:25:47 And then yeah, like you said, we’re like at the beginning of an exponential curve. 0:25:48 We’re not plateauing. 0:25:51 We’re literally like right here and about to go up. 0:25:53 So exactly. 0:25:55 Now when it comes to the topic of AGI though, 0:25:59 I think the thing that I struggle with the most around that conversation, right? 0:26:01 You got Sam Altman saying 2025. 0:26:07 You got Dario from Anthropic saying ’26 or ’27, if not later than that. 0:26:12 The problem I have, how do we know when we’ve hit AGI? 0:26:17 Because I feel like maybe Sam and Dario could have different definitions of it 0:26:20 and Sam might actually think we hit it in 2025, 0:26:22 but he didn’t hit Dario’s definition of it. 0:26:27 And I know Google has their whole like Google or OpenAI or maybe they both have it, 0:26:31 but they have like these levels of like various AGIs, right? 0:26:33 Nobody really knows where the goalpost is right now. 0:26:37 And I think Sam might have this idea of this is what AGI is to me.
0:26:39 And I think we’ll hit it in 2025. 0:26:43 While other people’s definition of AGI might be different than Sam’s. 0:26:45 And when Sam thinks we hit AGI, 0:26:49 other people in the space will be like, yeah, but that’s just Sam’s version of AGI. 0:26:50 It’s not really AGI yet. 0:26:53 Yeah, maybe we should segue into like some 2025 predictions around all this. 0:26:56 But you know, one thing that’s interesting is, 0:27:00 I’m not sure if you remember this, but apparently OpenAI’s deal with Microsoft 0:27:03 has all these things saying that when they hit AGI, 0:27:06 it’s a deal off or like the ownership’s like something changes. 0:27:07 The structure changes. 0:27:11 I think they still work together, but the structure dramatically changes somehow. 0:27:13 Yeah, yeah, in OpenAI’s favor, right? 0:27:16 Like Microsoft has way less control of the company once that happens. 0:27:19 And so it might be in their benefit to say they’ve hit AGI. 0:27:23 My understanding about AGI is everyone has a different definition for this, 0:27:26 but like as soon as it can do the work of like a typical person, 0:27:28 not like the most genius person in the world, 0:27:31 but like an average, you know, a person who like sits at your desk 0:27:34 and does emails and stuff like that, like an admin or something. 0:27:39 For me, as soon as it does some kind of work like that, that’s like basic AGI. 0:27:44 Yeah, so Dario actually, I think described how he saw it on his episode with Lex. 0:27:47 I don’t have the exact quote in front of me right now, 0:27:54 but essentially he said when sort of every topic AI is able to understand it at like a PhD level, right? 0:27:59 So it almost sounds like maybe your definition and maybe Sam’s definition might be 0:28:01 it could do anything a human can do, 0:28:04 but it sounded like Dario’s definition is like it could do anything 0:28:07 that the smartest human at a specific task can do. 0:28:12 Yeah, yeah, that’s, you know, arguing over the definition. 0:28:14 So that’s going to continue to happen. 0:28:21 That’s where I sort of get like with Sam saying 2025 and Dario saying 26 or 27 0:28:22 and everybody having these different definitions. 0:28:26 It’s like, I feel like you have to have some sort of like, 0:28:28 okay, this is how we know we’ve hit it. 0:28:30 Otherwise, this debate is going to rage on forever. 0:28:32 Yeah, yeah, the robots will be ruling the entire world. 0:28:34 They’re like, have we hit AGI yet? 0:28:39 While we sit back and the robots service everything, you know, it’s going to be like that. 0:28:45 I think in 2025, we’re going to get AI agents to actually work. 0:28:47 They actually go off and do work for you. 0:28:50 And the fact that these things also have an internal monologue going on. 0:28:54 I mean, for me, that’s AGI like it passes a Turing test. 0:28:58 If you didn’t know, you could chat with it and think you were talking to a human. 0:29:00 It’s got an internal monologue. 0:29:01 It can go off and do work for you. 0:29:04 I’m convinced all these things are going to be there in 2025. 0:29:05 I mean, two of them already are there. 0:29:06 And so for me, that’s AGI. 0:29:10 Yeah, I think AGI 2025, and then the interesting thing too, 0:29:15 as Sam Altman said, artificial superintelligence in the next thousand days or so. 0:29:19 You know, so that’s possibly artificial superintelligence within three to five years.
0:29:24 Artificial superintelligence is basically AI beyond human understanding, 0:29:28 beyond anything the smartest human in the world could possibly do. 0:29:30 You know, make Einstein look dumb, right? 0:29:30 I don’t know. 0:29:36 I feel like once you hit AGI, ASI is not too far afterwards, right? 0:29:39 Because so if I’m basing it off of like Dario’s definition, right? 0:29:43 And if Dario’s definition is like an AI that knows every topic, 0:29:46 as well as the smartest person on that topic, right? 0:29:52 If we say AGI is like that, well, then wouldn’t that mean that AGI would be smart 0:29:56 enough to figure out how to develop an ASI, right? 0:29:59 Like if it is like the smartest coder in the world, 0:30:04 the smartest engineer in the world, you know, the smartest writer in the world, 0:30:09 all of this like bucket of things that you would need to create ASI. 0:30:12 If it was the smartest at every one of those things in the world, 0:30:17 it doesn’t seem too much of a stretch that AGI, as soon as we hit that, 0:30:20 ASI comes pretty damn quickly after that. 0:30:23 Yeah, I mean, that’s where like three years makes sense to me. 0:30:27 Like if you keep adding 10 to 20 IQ points to the thing every year, 0:30:30 you get smarter than any human very quickly, right? 0:30:31 Yeah, yeah, yeah. 0:30:36 I mean, like right now, is there any AI that can solve math problems 0:30:38 that no human has managed to solve yet? 0:30:42 No, but what is interesting is like I said in like one of our last episodes, 0:30:44 like scientists are already using this stuff now, though, 0:30:45 and saying like it replaced it. 0:30:47 So it’s not replacing the smartest human, 0:30:50 but it’s already replacing like smart graduates. 0:30:53 Yeah, well, I mean, AlphaFold, right? AlphaFold 3, 0:30:56 they just open-sourced AlphaFold 3 so anybody can go use it. 0:31:00 But that’s like, you know, figuring out new ways to fold proteins and stuff 0:31:03 that are novel ways that humans haven’t figured out yet. 0:31:07 So if AI is figuring out these new novel things 0:31:10 that no human has figured out yet, once we get to AGI, 0:31:14 I just don’t see how ASI is not like fairly soon after. 0:31:16 Yeah, I’m beating that dead horse now. 0:31:17 Yeah, but you bring up AlphaFold. 0:31:20 I mean, that’s even another reason why I’m super optimistic 0:31:22 about the Trump administration coming in, 0:31:26 because people don’t realize like we’re in like the craziest time in human history. 0:31:28 It’s like, I don’t believe in the simulation theory, 0:31:31 but if you did, I understand why because we’re alive 0:31:34 in the most interesting possible time in humanity, right? 0:31:37 Like we’re at the birth of artificial general intelligence. 0:31:41 We’re almost at the birth of like a new entity or a new kind of being. 0:31:44 And the next four years, all of these things are going to combine, right? 0:31:48 Like, you know, AI, robotics, we’ll be applying it to solving cancer 0:31:49 and all these different things. 0:31:53 And so in that area, you do want to be moving as quickly as possible. 0:31:56 You don’t want too many restrictions because we’re probably going to be like, 0:32:01 we’re totally reshaping what America is like entirely, right?
0:32:15 We lived in the era where we knew what it was like before everybody had a cell phone 0:32:17 and after everybody has a cell phone. 0:32:22 We lived in an era when we had cell phones and an era where we had smartphones. 0:32:28 Like we’ve lived in this window of time where we’ve seen the most insane 0:32:33 advancements in technology like the world has ever seen in human history. 0:32:37 And we’re going to get to witness AGI and most likely ASI in our lifetimes. 0:32:39 Like that to me is mind blowing. 0:32:41 It is. Yeah. I grew up on IRC, right? 0:32:45 Like I grew up like when the internet was, was being born, right? 0:32:47 Like people didn’t know how to use the internet or what it was. 0:32:48 Or it was just, it was crazy. 0:32:52 It was just all this stuff like tied together barely somehow and all work somehow. 0:32:54 And then now everyone takes the internet for granted. 0:32:56 And it’s like a part of daily life. 0:32:58 It’s part of the whole world economy. 0:33:01 And then now we got people talking like, oh, is AI going to be a real thing? 0:33:02 It’s important or not? 0:33:04 It’s like, it’s going to look the same as everyone’s saying. 0:33:06 Like, is the internet going to be important or not? 0:33:07 Yes, it’s going to be important. 0:33:11 And most likely AI is way more important than the internet even because it’s intelligence. 0:33:13 It’s not just networking. 0:33:15 It’s intelligence. Absolutely. 0:33:16 Yeah. Yeah. 0:33:20 I know we’re going to do like probably a whole like 2025 predictions episode. 0:33:24 I imagine we’ll probably do an episode like that, like closer to New Year’s or something 0:33:26 where we just throw out like all of our predictions. 0:33:30 But I know when it comes to AGI, I think I probably live somewhere 0:33:35 between Sam Altman and Dario, like maybe 2026 and maybe not by 2025. 0:33:41 Maybe there’s like some version that’s like really, really close in 2025 0:33:44 and like OpenAI calls it an AGI. 0:33:45 But I don’t know. 0:33:47 Like I’m always bad at those predictions too. 0:33:51 Like every time I make a prediction like that, it always happens faster than I assume. 0:33:56 So me saying 2026 is probably a good bet for 2025. 0:33:58 So I’ll say I’m pretty bad at predictions on timing. 0:34:00 Like, I’m great at predicting 0:34:01 the kind of things that will actually happen. 0:34:04 Like I’ve done that so many times in my career that people have been kind of like shocked. 0:34:06 But like I’m always off on the timing. 0:34:09 So I don’t know. Like, yeah, I think 2025, but that’s based on my definition of like, 0:34:11 look, the thing’s got an internal monologue. 0:34:14 If they say that’s going to keep improving, the thing’s talking to itself 0:34:16 and like reflecting on what it’s working on. 0:34:18 I mean, that’s crazy. 0:34:22 And it sounds like they’re pretty confident that they’re going to have the agents work, 0:34:25 which probably means like, okay, you’re going to have like basically an AI assistant 0:34:29 that can handle your emails for you and stuff like that and schedule meetings and other things. 0:34:30 That’s probably all coming in 2025. 0:34:32 So for me, that’s like, yeah, that’s the best. 0:34:37 That’s something I can fairly confidently agree with that 2025 is going to be the year of the agents 0:34:41 that you just give it a prompt of what you want it to take care of for you. 0:34:45 And it’s going to go and use all the tools and do all the navigating and all the research 0:34:46 and handle that stuff for you.
0:34:48 We’ve already seen glimpses of it. 0:34:52 The glimpses we’ve seen are buggy, but like we can see where it’s going. 0:34:55 Microsoft just released Magentic-One or something like that. 0:34:57 You got the Claude computer use. 0:35:01 There’s been some news articles going around that ChatGPT has that capability. 0:35:05 They just haven’t rolled it out to like the consumer models yet. 0:35:10 So like, I think pretty confidently we’re going to have really solid agents in 2025. 0:35:16 I also think the whole like digital twin like modeling the world stuff is going to be really big in 2025. 0:35:21 I think we’re going to see the video models like Sora where they’re actually doing that whole like world 0:35:25 modeling thing and sort of like understand environments and it’s generating based on 0:35:31 what it knows about environments versus like trying to recreate exactly what it was trained on. 0:35:36 I’m not very good at explaining that, but like that whole digital twin like world modeling thing 0:35:42 I think is going to only become like a bigger component of a lot of these AI models next year as well. 0:35:47 Yeah. Yeah. I think probably 2027, rich people are going to have robots. 0:35:50 You know, 2030, most people have robots. 0:35:52 That’s like kind of generally what I think. 0:35:55 I can see that. I could buy that timeline. Yeah. Yeah. For sure. For sure. 0:35:58 I mean, the biggest thing for consumers is just getting those costs down, right? 0:36:01 Like you’re not going to be getting people going and buying, you know, 0:36:04 $50,000 humanoid robots to do their laundry for them. 0:36:07 Yeah. It’s not going to be great at it either at first. 0:36:10 Right. It’ll be like, I can kind of do it, but it didn’t fold it properly. 0:36:13 You know, my wife’s super picky about folding stuff. 0:36:16 So she’ll be like, no, no, no way. 0:36:17 Yeah. Yeah. Well, cool. 0:36:20 I think we covered quite a bit of ground on this episode. 0:36:23 You know, we talked about what the current election means. 0:36:27 We talked about the vibes in San Francisco shifting. 0:36:32 We talked about when we might see AGI, what AGI means, a lot of ground covered. 0:36:37 I think, I think that’s probably a good place to call it a wrap on this one. 0:36:37 It was fun. 0:36:41 I think that’s a good stopping point if you tune into this episode and you enjoyed it. 0:36:43 We’ve got some amazing episodes coming up. 0:36:46 If you like hearing our discussions about the current state of AI, 0:36:50 you want to hear us have discussions with other leaders in the AI space 0:36:54 and the people that are building these next AI generations. 0:36:57 Make sure you’re subscribed to the show, subscribe on YouTube, 0:37:00 subscribe on Apple Podcasts, Spotify, wherever you listen to podcasts. 0:37:02 We’re probably there. 0:37:05 We really, really appreciate you subscribing and thank you so much 0:37:07 for tuning into this episode. 0:37:08 We’ll hopefully see you in the next one. 0:37:22 [Music] 0:37:25 (upbeat music)
Episode 34: How will the 2024 election impact AI advancements by 2025? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) dive into the implications of the upcoming Trump term.
In this episode, Matt and Nathan discuss the potential AI developments over the next few years, how different political outcomes could shape AI progress, and the shifting landscape in Silicon Valley. They explore the latest in AI models like OpenAI's o1, the debate over AGI timelines, and how regulatory approaches might impact America's competitive edge in tech.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) This won’t become political; mood is changing.
(04:55) Reduce government size, invest severance in tech.
(08:28) AI can reveal corruption in government spending.
(10:58) AI regulation may favor big companies, hinder startups.
(14:50) Sam Altman expects AGI by 2025, despite skepticism.
(17:59) AGI expected around 2025-2027, training slowing.
(19:56) AI models don’t learn in real-time conversations.
(22:48) Humans struggle to foresee technological advancements’ impact.
(27:53) AGI leads to ASI due to intelligence.
(29:39) Optimistic about AI and future advancements.
(32:20) Predicts accurately, but often wrong on timing.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
Ranking 22 of the Most Popular AI Tools (Q4 2024 AI Tier List)
AI transcript
0:00:04 – Model agnostic, ship fast, solve a specific problem. 0:00:06 When people are making the AI startup, 0:00:08 they should be looking at cursor and perplexity. 0:00:09 – It’s pretty damn impressive. 0:00:11 Where would you put that one? 0:00:12 – S tier. 0:00:13 – That’s one of the few that we can totally agree, 0:00:16 like S tier, that’s one of our main go-tos. 0:00:19 One of the most useful AI tools available out there 0:00:20 right now in my opinion. 0:00:23 (upbeat music) 0:00:25 Hey, welcome back to the Next Wave Podcast. 0:00:27 I’m Matt Wolf, I’m here with Nathan Lanz. 0:00:31 And today we’re gonna do a super comprehensive 0:00:33 AI tool breakdown. 0:00:36 In fact, we’re gonna do an AI tools tier list 0:00:39 where Nathan and I breakdown kind of all the AI tools 0:00:42 that we’ve used in the past are using right now, 0:00:44 the most talked about AI tools. 0:00:47 And we’re gonna put them on a tier list from S to F 0:00:50 and really rank where we would put them. 0:00:53 And also talk a little bit about how we’re using each one 0:00:57 and give a little context as to why we’re ranking them 0:00:58 that way. 0:00:59 So you’re probably gonna listen to this. 0:01:01 You’re probably gonna completely disagree 0:01:02 with some of our rankings. 0:01:03 You’re probably gonna be frustrated 0:01:05 that we excluded some of those tools. 0:01:07 Let us know all that stuff in the comments. 0:01:07 We wanna hear it. 0:01:09 We wanna know where you would rank them. 0:01:11 We wanna know what tools we’re missing. 0:01:12 And if you liked this episode, 0:01:13 we’ll probably do more like it. 0:01:15 So let us know that as well. 0:01:17 But without any further ado, 0:01:20 let’s go ahead and just break down this tool list here. 0:01:22 (upbeat music) 0:01:25 – When all your marketing team does is put out fires, 0:01:26 they burn out. 0:01:27 But with HubSpot, 0:01:30 they can achieve their best results without the stress. 0:01:33 Tap into HubSpot’s collection of AI tools, 0:01:36 Breeze to pinpoint leads, capture attention, 0:01:39 and access all your data in one place. 0:01:40 Keep your marketers cool 0:01:42 and your campaign results hotter than ever. 0:01:46 Visit hubspot.com/marketers to learn more. 0:01:48 (upbeat music) 0:01:53 – So if you’re listening to like the audio version 0:01:54 of this podcast, 0:01:57 this might be one that you might appreciate 0:01:59 the video version if you actually wanna see the tier list. 0:02:02 We’ll do our best to talk about the tool 0:02:05 and tell you where we’re putting it on the tier list. 0:02:07 But this is sort of a visual episode. 0:02:08 I’m gonna start all the way on the left. 0:02:11 Let’s start with Grock here. 0:02:14 Where would you put Grock/XAI? 0:02:18 – Yeah, I mean, I hate to not give it like a super high rank 0:02:19 ’cause like obviously I’m a fan of Elon Musk. 0:02:22 I think I’m one of the few people that I know 0:02:25 who thinks that long-term that Grock and XAI 0:02:27 is going to be the main competitor to OpenAI. 0:02:29 Like I think they’re gonna be bigger than Anthropic 0:02:32 is my current opinion, which seems to be controversial. 0:02:34 But right now, I would probably put it as a B. 0:02:35 It keeps getting better. 0:02:38 Like the version two is a lot better 0:02:39 than most people realize. 0:02:42 It’s not as good as ChatGBT or Claude, 0:02:44 but it is nice that you can talk to it 0:02:45 about pretty much anything you want to. 0:02:49 It is less censored and the image generation is pretty good. 
0:02:51 The image generation is a lot more free 0:02:55 to make whatever crazy meme you want or anything. 0:02:58 So I think Grock is like a solid B right now. 0:03:01 I think long-term it could be an S. 0:03:05 – Yeah, so for me, just based on like how often I use Grock, 0:03:07 I’d probably put it like down in D. 0:03:10 Like that’s not me saying it’s a bad product. 0:03:11 – Yeah. 0:03:13 – But for me, it’s saying that like, 0:03:16 all right, if I need to go use a Chatbot right now 0:03:18 to get an answer to something, 0:03:20 Grock isn’t gonna be my first choice. 0:03:23 It’s not even gonna be like my top three, right? 0:03:25 Like ChatGBT, Claude, perplexity, 0:03:26 you’re probably gonna be my top three. 0:03:29 Grock would probably fall number four for me 0:03:31 for actually going and using it 0:03:33 to ask questions right now. 0:03:35 – One thing that I’ve noticed that’s kind of surprised me 0:03:37 is I thought it would be a little bit better 0:03:39 at getting real-time data from X somehow. 0:03:40 – Yeah. 0:03:41 – And I don’t think they really nailed that yet. 0:03:43 So that’s one thing that kind of sucks is like, 0:03:44 I went there several times like thinking like, 0:03:47 oh, I can ask it some question about an X user 0:03:50 about my own profile or whatever for some kind of insights. 0:03:53 And it kind of is not that great at it. 0:03:56 So I don’t know, maybe a C, maybe a green C. 0:03:59 – I don’t think I’m gonna keep it in D for myself 0:04:01 because I think that like the biggest redeeming factor 0:04:06 of Grock for me is that it uses the Flux 1.1 Pro model. 0:04:09 I think it’s Flux 1.1. 0:04:12 But, you know, Flux has three models, right? 0:04:13 The AI image generator Flux. 0:04:17 It’s got Flux Snell, it’s got Flux Dev, 0:04:19 and it’s got Flux Pro. 0:04:22 Flux Snell, I believe, is like the free version 0:04:23 that you can use. 0:04:25 Flux Dev is a little bit better model 0:04:28 that you can use with like the API 0:04:31 and it’s improved a little bit above the Snell model. 0:04:33 And then you’ve got the Flux Pro model, 0:04:36 which is actually kind of expensive to use, 0:04:39 but it creates really, really good AI images. 0:04:44 And Grock is actually using the Flux Pro model in there. 0:04:47 So by using Grock, you actually get access 0:04:50 to like one of the better AI image-generated models 0:04:51 is out there. 0:04:55 So for me, I think I’d probably solidly put it as a C, 0:04:57 just because it’s not my main go-to, 0:05:00 but it’s got a great AI image generator in it. 0:05:02 – I do see a lot of memes and stuff on X 0:05:04 that are definitely being generated with Grock. 0:05:06 – Oh yeah. 0:05:09 Anytime you see like an image that has like blood 0:05:12 or like an actual, like famous person in it, 0:05:13 like a lot of the Kamala and Donald Trump 0:05:15 and Elon Musk memes that you’ve seen, 0:05:18 most likely created with Flux, 0:05:20 most likely done inside of Grock. 0:05:23 All right, so next up, let’s go chat GPT. 0:05:26 Where do you, where would you put chat GPT? 0:05:27 – Yes. 0:05:29 – You put chat GPT as S. 0:05:32 – Like I still use chat GPT more than anything else. 0:05:34 – So you’re going to chat GPT more than Claude, 0:05:35 more than Perplexity? 0:05:36 – Yes, yeah, I still am. 
0:05:39 At first some of, yeah, I felt like Claude was getting better 0:05:41 in terms of like the projects were better 0:05:43 than custom GPTs and it still is slightly, 0:05:46 but like OpenAI’s already caught up there too, 0:05:49 where like there’s more, like you can put multiple files 0:05:51 into the custom GPT and things like that. 0:05:55 So that’s, I use it daily like for translating with my wife. 0:05:59 I use the voice daily now. 0:06:00 So I use it that way. 0:06:02 I still have a custom GPT that I use, 0:06:06 that I created for helping me write threads on X. 0:06:08 I’ve been using that daily. 0:06:11 And also I have a thing that I use for tracking my, 0:06:13 the calories I eat and the protein I intake every day 0:06:14 for working out. 0:06:15 – Oh yeah. 0:06:16 – I use that every day. 0:06:18 So like, and like I’m sure I could do that stuff in Claude too, 0:06:20 but I’m just finding like the combination of features 0:06:24 that it has, it still has just, it’s my main go-to. 0:06:28 – So for me, it’s not my, it’s not my first go-to. 0:06:30 I’ll probably agree and leave it in S tier 0:06:33 because I do think that chat GPT is like 0:06:36 constantly setting the bar for everything else, right? 0:06:39 Like I feel like I probably go to Claude in perplexity 0:06:42 a little bit more than I go to chat GPT. 0:06:45 But, you know, they’ve got their 01 model, 0:06:48 which is the best reasoning logic model 0:06:51 that exists right now, according to pretty much 0:06:55 any sort of benchmark you look at GPT 4.0 0:06:57 is kind of leading the pack. 0:07:01 You know, almost anytime you see a new model come out, 0:07:04 you know, Google Gemini or Claude model or Lama model 0:07:07 or any model really, who do they compare it to? 0:07:10 They’re always like, oh, it’s almost as good as GPT 4. 0:07:13 It’s right up there on par with the benchmarks of GPT 4. 0:07:16 So I almost feel like you gotta leave it in S tier 0:07:19 just for the reasons that it is like setting the bar 0:07:22 that every other company is trying to get to. 0:07:24 – Yeah. And, you know, I mean, 0:07:26 we’re talking specifically about chat to be T here, 0:07:27 but like if we were talking about OpenAI as a company, 0:07:31 definitely S because like it sounds like they have more 0:07:33 stuff in the works that, you know, with the agents, 0:07:35 I think they’re ahead in agents. 0:07:36 And like you said, with the reasoning model, 0:07:38 they’re definitely ahead there. 0:07:40 And I might not use that every day, 0:07:43 but like I am seeing like tweets from like real scientists 0:07:45 saying like, I’m using this every day now. 0:07:49 And holy crap, if 01 is better as people are saying, 0:07:51 they’re super pumped about it. 0:07:53 And it’s like what one scientist said that it’s already, 0:07:56 01 preview that a paper he was working on, 0:07:58 it helped him get it done in two days, 0:08:00 what normally would have taken him a month. 0:08:02 And so yeah, I would definitely put them as tier. 0:08:03 – Cool. Yeah. I’m not going to, 0:08:05 I’m not going to fight for that one too hard. 0:08:08 It’s not my first go to, but all right. 0:08:11 So next up is Claude for me. 0:08:13 I put Claude in S tier because it’s probably the one 0:08:14 I go to the most. 0:08:18 I have tons of custom projects that I’ve built out in Claude. 0:08:20 You know, like I’ve got a project that helps me write scripts. 0:08:22 I’ve got projects that help me with my notes 0:08:24 for making YouTube videos. 
0:08:27 I've got all of these various projects that I've made. 0:08:28 So when I need to make a script, 0:08:31 I can literally drag and drop an article in there. 0:08:33 And it sort of rough drafts the script for me, 0:08:37 or I can pull in all of the URLs from like a YouTube video 0:08:38 that I'm putting in. 0:08:40 And it creates like a list of resource links for me, 0:08:43 really clean and easy that I use in my description. 0:08:48 So like Claude is probably my second most used tool 0:08:51 behind Perplexity. 0:08:53 So for me, that one's S tier. 0:08:55 And I also really, really love the artifacts, right? 0:08:58 Like I love that I can have it generate code. 0:09:00 And then it will actually show me what that code looks like 0:09:02 over on the right sidebar. 0:09:05 That to me is like really, really valuable. 0:09:07 So for me, Claude is an S tier. 0:09:08 – Yeah, I mean, I can agree on S tier. 0:09:11 I guess if it was my own tier list, I'd probably put it A. 0:09:13 I'm just, I've always felt weird about the fact 0:09:15 that it's like basically an employee from OpenAI 0:09:16 who left and tried to do his own thing. 0:09:19 And like, and for the first year, 0:09:21 it was like basically a copy of GPT. 0:09:23 So like, it's like, there's just a weird feeling there. 0:09:25 And like, and it's called Claude. 0:09:26 I don't know, I hate the name. 0:09:31 And also, I've noticed recently, like it, like, 0:09:32 people said it was better at writing. 0:09:34 I'm just not seeing that lately. 0:09:36 Like I see it's like pretty much the same as ChatGPT. 0:09:39 Maybe on code, it's better, you know, it seems to be. 0:09:41 I think that'll change quickly. 0:09:43 Artifacts, definitely give that to them. 0:09:44 Like artifacts is amazing. 0:09:47 Like that's like the one invention that they've created 0:09:49 that's like actually really, really cool is artifacts. 0:09:52 – Yeah, well, they also showed off the computer use 0:09:55 recently as well, where you can basically give it 0:09:57 instructions and it will actually click around 0:09:58 on your screen and stuff. 0:10:00 – Yeah, which no one seems to be using. 0:10:02 And my understanding from like hearing, you know, 0:10:04 from friends is like that basically OpenAI 0:10:06 already had something like that, like, for a while. 0:10:09 And it's just like, I think Claude tried to beat them to it. 0:10:11 But then like, they're also probably behind there. 0:10:14 And it's like, yeah, it's just not useful yet. 0:10:15 – So what do you think? 0:10:16 Do you think it should stay in S tier? 0:10:19 Or do you think it should bump down to A tier? 0:10:22 Cause it's constantly playing catch up. 0:10:24 – I mean, I do think it's A tier. 0:10:25 I really do. 0:10:28 And the other thing too, I've noticed is people used to say 0:10:31 that, you know, ChatGPT was super woke 0:10:33 and it was super left-wing and maybe just slightly, 0:10:35 but I feel like they've gotten a lot better at that. 0:10:36 Like where you can talk to it. 0:10:40 Like, like I had it edit a newsletter issue 0:10:42 that I put out the other day that had some talk 0:10:44 about the Trump election that just happened, 0:10:46 like Trump winning and, you know, 0:10:48 and I understand that's controversial to people. 0:10:49 Some people hate Trump, 0:10:51 but I talked about like why it's good for AI. 0:10:55 Claude refused to edit the newsletter issue, right? 0:10:57 I was like, that sucks.
0:10:59 And then I went to ChatGPT and not only did it edit it, 0:11:00 but I felt like it did it as good of a job 0:11:02 as Claude would have or better. 0:11:05 And so the fact that it's like that, 0:11:07 that there's certain things it just won't help you with, 0:11:09 like long-term if they keep doing that, 0:11:10 it's like, I don't know how I can rely on it 0:11:15 as like my main assistant for my life if they do that. 0:11:17 – So to add to what you're saying 0:11:20 and to actually sort of agree with bumping it down to A 0:11:21 tier, I do use it a lot. 0:11:23 It is one of my more favorite tools, 0:11:24 but here's like another argument 0:11:26 for pushing it down to A tier. 0:11:29 They just bumped the price on Haiku, right? 0:11:31 Because they said, well, it's smarter. 0:11:33 So it should cost more, right? 0:11:34 So they bumped the price on Haiku. 0:11:36 And the other thing about Claude, 0:11:38 like the biggest complaint I hear the most 0:11:40 is getting rate limited. 0:11:42 I don't really hear people talking about how like 0:11:45 in ChatGPT, they get to a rate limit very often. 0:11:47 But in Claude, everybody seems to hit the rate limits. 0:11:50 I don't really find I hit the rate limits very often 0:11:51 unless I'm writing code. 0:11:53 If I'm just like having conversations 0:11:55 about like a YouTube script or something like that, 0:11:57 I never run into rate limits. 0:11:59 But I do know that's like a big complaint people have 0:12:00 is that for whatever reason, 0:12:03 they run into rate limits a lot with Claude. 0:12:04 – Yeah, I mean, I think, I mean, 0:12:07 they have to be more cash strapped than OpenAI for sure. 0:12:10 Like no one talks about that, but they have to be like, 0:12:12 like if you're the number two player in a space, 0:12:14 you're always going to have a way harder time 0:12:16 raising capital. 0:12:17 – Right. 0:12:19 – But they also have the backing of Amazon 0:12:20 and Google, right? 0:12:22 Like Google is like a huge investor in Anthropic 0:12:24 and Amazon basically said, 0:12:28 we're going to use Claude as the future Alexa, right? 0:12:30 So it's like, they do have that Amazon backing. 0:12:32 Like Amazon could be to Anthropic 0:12:35 what Microsoft is to ChatGPT, you know? 0:12:36 – Yeah. 0:12:37 – True, true, true. 0:12:38 – So we'll see how that plays out. 0:12:40 But I think you've talked me 0:12:43 into sort of conceding it down to an A. 0:12:45 All right, so next up is Gemini. 0:12:48 The thing about Gemini is we have Gemini 0:12:49 and we have notebook LM on the list 0:12:51 and notebook LM is powered by Gemini. 0:12:55 So if we're just looking at like Gemini on its own, 0:12:57 like ignore the fact that notebook LM is powered 0:13:00 by Gemini, we'll rank that one separately. 0:13:02 Think about like the Gemini chat 0:13:07 and like maybe let's include like Google AI search, right? 0:13:10 Like the AI overviews that Google does, 0:13:12 we'll consider that part of Gemini as well. 0:13:15 So considering the Gemini chat 0:13:17 and considering Google's AI overviews, 0:13:19 where would you put Gemini? 0:13:20 – F tier. 0:13:21 – You put Gemini in F tier. 0:13:23 (laughing) 0:13:25 – I mean, I feel bad. 0:13:27 Like Logan is gonna hate me for saying that. 0:13:31 And I will concede, I have not used it recently 0:13:33 and I know everyone said it's gotten a lot better 0:13:34 and so I should use it again.
0:13:37 And I just, I feel like, I haven’t seen a good, 0:13:39 like notebook LM is a great product. 0:13:41 So like that’s actually the first great product 0:13:43 I’ve seen from Google in a long time. 0:13:45 I feel like the Gemini release, 0:13:47 I feel like it’s been like a totally botched launch 0:13:49 and I don’t know anyone who personally uses it. 0:13:51 So that’s why I put it, put it F. 0:13:52 (laughing) 0:13:54 – I’m in the boat of like, 0:13:57 I hate to put anything in the F tier. 0:13:59 Like, ’cause you know, 0:14:03 I actually like a lot of people at Google. 0:14:06 I, the thing about Gemini that there are redeeming factors 0:14:06 about Gemini, right? 0:14:09 Like Gemini’s got the largest context window 0:14:11 of any platform, right? 0:14:13 So if you want to upload the entire Lord of the Rings 0:14:16 book series or the entire Game of Thrones book series 0:14:18 and have discussions about it, 0:14:20 Gemini is pretty much the only game in town 0:14:24 that’s gonna let you upload that large of a context window. 0:14:26 So I think that’s sort of a redeeming factor 0:14:27 for them as well. 0:14:31 Gemini, I believe also has the ability 0:14:33 to sort of read videos now too, right? 0:14:35 So you can put a video in there 0:14:36 and it understands what’s going on in the video 0:14:40 and can tell you about what it sees inside of that video. 0:14:43 I don’t know if any other models can do that. 0:14:46 Gemini also powers notebook LM, 0:14:48 which, you know, we’re kind of trying to keep the two separate 0:14:52 at the moment, but like notebook LM is really, really useful. 0:14:55 So I have a hard time like throwing it as like a solid F, 0:14:57 but the only reason I would put it sort of lower 0:15:00 on the tier list is for the same reasons you mentioned, 0:15:02 it’s not really one of my go-tos. 0:15:03 It’s not something where I’m like, 0:15:05 okay, I need to go use AI for this. 0:15:07 Let’s go pull up Gemini. 0:15:09 No, I’m usually gonna go to Claude, ChatGPT or Perplexity, 0:15:10 right? – Yeah. 0:15:14 – So for me, it’s just not the top of mind tool yet 0:15:17 that I would go and use when I need AI’s help. 0:15:19 – Yeah, it’s got great capabilities. 0:15:21 Like if I was ranking it purely on capabilities, 0:15:23 I might put it A or B. 0:15:26 – And I’m not a huge fan of the AI overviews yet. 0:15:29 Like you mentioned, it kind of didn’t know the difference 0:15:31 between real news and like memes, right? 0:15:33 Like that’s why I was telling people to put glue on pizza 0:15:35 and eat 10 rocks a day or whatever, right? 0:15:37 Like it just, it didn’t know 0:15:39 that like this wasn’t factual information. 0:15:42 It was just information it found on the internet, right? 0:15:45 But I think they’ve improved a lot of that stuff as well. 0:15:46 But it’s gotten to a point now 0:15:49 where I don’t really use Google or see AI overviews too often 0:15:51 ’cause I go to Perplexity for that kind of stuff now. 0:15:53 So I don’t know, personally, 0:15:55 I have a hard time throwing it as an F 0:15:57 because it is a capable model. 0:16:00 It’s just not one of the models it’s like my go-to. 0:16:03 I think I’d probably maybe put it at like a C or a D. 0:16:06 – D, let’s do D, something has to be lower. 0:16:09 Like otherwise we’re gonna have like all ABs and Cs 0:16:10 – That’s true, that’s true. 0:16:11 And if I’m looking at the two, 0:16:15 I probably still would go to Grock before Gemini as well. 0:16:16 – Yeah, I would, yeah. 
0:16:18 (upbeat music) 0:16:19 We'll be right back. 0:16:21 But first I wanna tell you about another great podcast 0:16:22 you're gonna wanna listen to. 0:16:26 It's called Science of Scaling hosted by Mark Roberge 0:16:29 and it's brought to you by the HubSpot Podcast Network, 0:16:32 the audio destination for business professionals. 0:16:34 Each week host Mark Roberge, 0:16:37 founding chief revenue officer at HubSpot, 0:16:39 senior lecturer at Harvard Business School 0:16:41 and co-founder of Stage Two Capital, 0:16:44 sits down with the most successful sales leaders in tech 0:16:47 to learn the secrets, strategies, and tactics 0:16:49 to scaling your company's growth. 0:16:51 He recently did a great episode called 0:16:55 How Do You Solve For Siloed Marketing and Sales? 0:16:57 And I personally learned a lot from it. 0:16:58 You're gonna wanna check out the podcast, 0:17:02 listen to Science of Scaling wherever you get your podcasts. 0:17:05 (upbeat music) 0:17:06 – Let's do Perplexity next. 0:17:09 So Perplexity for me, it's an S tier. 0:17:10 I use it every single day. 0:17:11 – S tier. 0:17:12 (laughing) 0:17:13 Yeah, yeah. 0:17:15 Like right now I'm like ChatGPT and Perplexity, 0:17:16 that's like daily for me. 0:17:18 And they just, you know, 0:17:20 I mean, Aravind was one of our first guests who came on. 0:17:22 Like, you know, I've talked with him personally. 0:17:25 He's like, he's a great guy and they ship so fast. 0:17:26 You know, and like, 0:17:28 and coming from like Silicon Valley and doing startups, 0:17:31 I know like how fast you ship things is so important. 0:17:33 I would say they're shipping faster than anyone. 0:17:35 Like they're doing better than OpenAI in that regard. 0:17:37 – Well, and they're more like model agnostic, right? 0:17:40 So like, you know, Perplexity 0:17:42 powers part of Future Tools, right? 0:17:45 And on Future Tools, it uses the, 0:17:47 it uses Llama with Perplexity. 0:17:49 So it does a search on the web, 0:17:49 finds the information, 0:17:53 uses Llama to sort of like write the summaries and stuff. 0:17:56 When I use Perplexity like on my desktop, 0:17:57 not within the API, 0:18:00 but just like I'm gonna use it to look up some information, 0:18:03 I'm using it with Claude 3.5 Sonnet. 0:18:04 They even added a new feature 0:18:09 where you can use the GPT o1 model, or the OpenAI o1 model, 0:18:13 and actually have it do more like logical thinking through 0:18:16 what it found in its search results, right? 0:18:19 So like it actually allows you to use 0:18:22 like whatever model is your favorite 0:18:24 plus the web search results, you know? 0:18:26 – Yeah, that's awesome. 0:18:28 – That's one of the few that we can totally agree. 0:18:30 Like S tier, that's one of our main go-tos. 0:18:33 One of the most useful AI tools available out there 0:18:34 right now in my opinion. 0:18:35 – That is interesting. 0:18:37 So it kind of has replaced not only Google 0:18:39 but Wikipedia as well in a way, 0:18:41 which they probably use Wikipedia as a source, 0:18:42 but that is interesting. 0:18:43 – Yeah, yeah, yeah. 0:18:45 And I've done a lot of like research. 0:18:48 Like the other day I was looking for a new like headset 0:18:50 with a microphone on it, right? 0:18:52 Like headphones with a microphone. 0:18:54 And I asked it to basically like look 0:18:57 for the most recommended headset microphones 0:19:01 by YouTubers and rank them like top 10.
0:19:03 And it went and looked for all of this 0:19:05 and it linked me up to the YouTube videos 0:19:07 and it ranked them for me and it gave me the source. 0:19:10 And then I went, okay, cool, this is how it ranked them. 0:19:12 Now I’m gonna start watching some of these YouTube videos. 0:19:13 And I started clicking into the videos 0:19:16 to see what the people were actually saying about them, 0:19:18 what the sound quality was like and all that kind of stuff. 0:19:20 So I was actually starting to like click 0:19:23 into all the resources after it did all the research for me. 0:19:25 And like, man, it’s just, it’s, for me, 0:19:27 it’s been really, really helpful. 0:19:30 – It’s probably the most useful AI tool. 0:19:33 – Cool, let’s do Lama next. 0:19:35 I’m curious your thoughts, 0:19:37 ’cause I don’t think you probably use Lama very often, but– 0:19:38 – Yeah, that’s what I would say. 0:19:39 I have a hard time ranking. 0:19:41 I’m gonna like trust your rank a little bit better than mine. 0:19:43 I’ve used it one time. 0:19:46 – So Lama, like I mentioned, Lama is actually the AI 0:19:49 that powers what’s going on at Future Tools. 0:19:51 So all the like descriptions and things like that 0:19:54 that are on the website are a combination of perplexity 0:19:56 and then Lama, right? 0:19:57 ‘Cause Lama is really kind of the cheapest model 0:19:59 that you can use right now. 0:20:00 – Okay. (laughs) 0:20:02 – ‘Cause it’s open source, right? 0:20:03 – Yeah, yeah. 0:20:06 – So like, I think the fact that it’s open source, 0:20:09 you get, it’s gotta have some like brownie points for that, 0:20:09 right? 0:20:11 Like that, you know, gives it some cred. 0:20:13 Because so many of the open source models 0:20:16 that people are using now, they started with Lama 0:20:19 and then sort of modified them, forked them, 0:20:20 whatever you wanna call it 0:20:21 and like built their own models, 0:20:24 but we’re based on Lama originally, right? 0:20:27 So I think that’s like an important factor. 0:20:30 I would probably put it into like B tier. 0:20:35 Like I would use Lama probably before I would use Grock 0:20:38 or like getting a text response 0:20:40 just because it is that open source model 0:20:43 and it is insanely inexpensive to use 0:20:45 as a large language model. 0:20:47 When I use my Meta Ray-Ban glasses, 0:20:48 they’ve got Lama built into them. 0:20:51 So if I rarely ask my Ray-Ban glasses questions, 0:20:54 but when I do, it’s answering with Lama. 0:20:55 For me, it’s gotten to a point 0:20:58 where like most of the large language models 0:21:02 are like as good as each other for my use cases, right? 0:21:05 Like a lot of people that use these large language models 0:21:08 that need them for like the logic reasons 0:21:13 or more like really in-depth scientific complex topics, 0:21:16 they’re probably a little bit more sensitive 0:21:18 to which model they use. 0:21:20 For me, if I’m just like asking questions 0:21:23 that’s like general knowledge that’s out in the world 0:21:26 or if I’m trying to get it to help me like write a story 0:21:29 or if I’m getting it to try to summarize something for me, 0:21:31 most of those use cases, 0:21:34 like all of the models do pretty well at this point. 0:21:36 And so the fact that we’ve got an open source version 0:21:40 that does what ChatGPT and what Claw does 0:21:43 and you know, what all these other models do 0:21:46 really decently for most people’s use cases, 0:21:48 that’s why I think there’s value there, right? 
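For anyone who wants to wire up the pattern Matt describes above, where Perplexity handles the web search and a Llama-based model writes the summary (the setup behind Future Tools), here is a minimal sketch against Perplexity's OpenAI-compatible chat completions API. The endpoint is Perplexity's public one, but the exact model name changes over time and the environment variable is just a placeholder, so treat both as assumptions and check the current docs.

```python
# Minimal sketch of "web search + Llama-written summary" via the Perplexity API.
# The model id below is an assumption -- Perplexity has renamed its online models
# over time -- and PERPLEXITY_API_KEY is a placeholder env var name.
import os
import requests

API_KEY = os.environ["PERPLEXITY_API_KEY"]

def summarize_with_web_search(question: str) -> str:
    resp = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "llama-3.1-sonar-small-128k-online",  # assumed model id
            "messages": [
                {"role": "system", "content": "Answer concisely and cite sources."},
                {"role": "user", "content": question},
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    # Responses follow the OpenAI chat completions shape.
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(summarize_with_web_search(
        "What are the most recommended headset microphones according to YouTubers?"
    ))
```

Swapping the model string is all it takes to route the same search-backed request through a different model, which is the "model agnostic" point Nathan makes later about Perplexity and Cursor.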
0:21:52 Like most people are using something like ChatGPT 0:21:53 to help them write their emails better 0:21:57 or to cheat on their social studies homework 0:22:01 or to do fairly simple things 0:22:03 that pretty much all the large language models 0:22:05 are pretty proficient at now. 0:22:08 And so, you know, Llama was the open source version of that. 0:22:09 – Right. 0:22:11 Yeah, I mean, since I haven't used it much, 0:22:12 I'm not gonna like argue on it. 0:22:14 I probably would have put it the same as Grok 0:22:16 if it was just my own ranking. 0:22:19 But since the fact that you're actually using it 0:22:21 and like neither one of us uses Grok 0:22:24 on a daily basis, like, yeah, it should be a B then. 0:22:26 – Let's move on now to, 0:22:28 let's talk about some of these like coding platforms. 0:22:31 So we got GitHub Copilot, we got Cursor. 0:22:34 – Replit AI agent would be another one. 0:22:35 – Yes, we got Replit. 0:22:38 – There's another one that people 0:22:39 are probably gonna be angry that we've put on here. 0:22:41 I can't remember the name of it. 0:22:41 – Is it Bolt? 0:22:42 Are you talking about Bolt? 0:22:44 – Oh, is that, yeah, it is Bolt, isn't it? 0:22:45 – Bolt is pretty good. 0:22:46 I've only tried it for a couple of minutes. 0:22:48 So I don't think we should put it on this tier list 0:22:51 because like I haven't played with it enough myself. 0:22:51 – Yeah. 0:22:53 – But it was impressive. 0:22:54 Like I literally told it, like, 0:22:57 I was watching a video earlier today from Tiff in Tech, 0:22:59 who's a YouTuber who does tech videos. 0:23:03 And she actually asked it to make me a clone of Spotify. 0:23:04 And it did. 0:23:07 And I was like, like, literally that was the prompt. 0:23:09 Go make me a clone of Spotify. 0:23:10 And like when I was using it, 0:23:13 I asked it to make me like an Evernote type tool 0:23:16 where I can sort of save bookmarks from around the web in it. 0:23:19 And it made it pretty quickly. 0:23:22 Like, it's not like Cursor where it looks like an IDE 0:23:24 where you're, you know, you can see the code 0:23:25 and all that kind of stuff. 0:23:27 It looks like a chat bot. 0:23:28 You ask it to do something 0:23:30 and it just sort of runs through the code. 0:23:31 And it's like, all right, here's your app. 0:23:33 I don't do a lot of code. 0:23:35 So I'm gonna have to lean on you a little bit 0:23:36 for these code ones. 0:23:38 – Yeah, and disclaimer, I don't code every day. 0:23:40 Like, you know, I've done three tech startups. 0:23:42 I've coded on and off most of my life. 0:23:47 You know, GitHub Copilot, originally I loved it. 0:23:49 It's actually how I got involved in AI, 0:23:51 or at least more in like the current wave of AI. 0:23:52 I'd actually been doing some stuff with AI 0:23:54 in my previous startup. 0:23:56 We'd actually had a computer vision department. 0:23:58 In this current wave, that's how I got interested. 0:24:00 It was like trying out GitHub Copilot 0:24:01 and seeing how amazing it was. 0:24:04 I was like, I was actually going back and like just for fun, 0:24:08 I was like, oh, I should like pick up C++ again. 0:24:11 And maybe I can make like a cool little game demo, 0:24:14 like something I was doing like a Zelda kind of game 0:24:16 but it was kind of like Dark Souls themed, 0:24:17 just like messing around. 0:24:20 And it was amazing how fast I was able to like learn C++ 0:24:22 again just using GitHub Copilot.
0:24:25 I was like, oh, ’cause I know basic like coding logic 0:24:25 and stuff and it was like, 0:24:28 it was just like helping teach me things so fast. 0:24:30 But I haven’t used it recently 0:24:32 and it seems like less people are using it. 0:24:35 It feels like maybe they fell behind a little bit 0:24:37 compared to cursor and other ones. 0:24:39 So I think I’d probably put it like in a solid, 0:24:43 I don’t know, B tier would be where I’d put it, yeah. 0:24:43 – Okay, yeah. 0:24:46 I mean, I’ve never used Copilot like not even once. 0:24:48 So like I really don’t have any sort of rebuttal 0:24:49 or argument to that. 0:24:51 So B tier it is. 0:24:52 – Yeah, I mean, it’s mainly really good 0:24:54 for like inline suggestions of like, 0:24:55 oh, you’re writing something 0:24:57 and this is what I think you’re wanting to write 0:24:59 and give you a great suggestion. 0:25:00 You press tab to complete it. 0:25:02 It’s like, that’s how they got started. 0:25:04 I think they have some other more advanced features 0:25:06 but I haven’t heard of anyone actually using them so. 0:25:08 – Yeah, yeah. 0:25:09 And then we have cursor, right? 0:25:12 And cursor, we actually did an episode with Riley Brown 0:25:15 where we used cursor and built a whole app 0:25:17 like within the course of that episode. 0:25:20 It’s pretty damn impressive. 0:25:22 Where would you put that one? 0:25:24 – So I mean, I would put cursor at S tier 0:25:26 ’cause I think it’s actually having a major impact 0:25:27 on software development. 0:25:29 Like it’s like, there’s lots of AI stuff 0:25:30 that we talk about where it’s like, 0:25:32 oh, it’s cool and it has promise in the future. 0:25:36 I’m hearing from lots of friends in Silicon Valley 0:25:39 that like lots of teams are using cursor now. 0:25:41 And it’s the fact of, yeah, 0:25:42 it can give you the inline suggestions 0:25:44 like GitHub co-pilot does, 0:25:47 but also it can like look at your entire code base 0:25:50 and it can have context of the entire code base. 0:25:52 It can, it can, it’s helping people a lot 0:25:54 for like debugging issues, you know? 0:25:56 ‘Cause like debugging issues is like one of the things 0:25:57 when you’re in software development, 0:25:58 it’s like the thing where you’re like, 0:26:00 damn, why am I doing this job? 0:26:03 Like you feel like the job is amazing 0:26:04 when you’re like, you’re having a concept 0:26:06 and you start, you know, you’re putting it together 0:26:08 and you get to see a demo of it work. 0:26:09 But as soon as you start debugging, 0:26:11 it’s just like, I hate this job. 0:26:14 – Yeah, yeah, I’ll keep it up there and S tier, I agree. 0:26:17 I, you know, my sort of main experience using cursor 0:26:20 has been watching Riley go back and forth with us 0:26:22 on an episode. 0:26:25 And within what, the course of 45 minutes we went from, 0:26:26 like we didn’t have an idea. 0:26:29 Like we used AI to come up with the idea on that episode. 0:26:31 And then by the end of the episode, we had a working app. 0:26:35 And I was like, this is a 45 hour long episode 0:26:37 and we have software now. 0:26:39 Like that is blowing my mind. 0:26:41 – Yeah, something else it’s gonna do that’s amazing 0:26:44 that I haven’t heard a lot of people talk about is you can, 0:26:45 like a lot of times when you’re coding, 0:26:47 you might be working with a certain API 0:26:48 or some kind of, you know, 0:26:50 some kind of library that you’re using 0:26:53 that you don’t fully know how to use it. 
0:26:55 And you end up having to go back and back to the docs 0:26:56 over and over and over. 0:26:58 It's like, oh, look, how do you do this? 0:26:59 How do you do that? 0:27:01 You can like just like link to docs 0:27:05 and you can like, like kind of like upload it to Cursor. 0:27:07 And like you could say, here's whatever API. 0:27:09 And then you can just like @-tag that 0:27:12 when talking to Cursor and it knows all of the docs. 0:27:13 And then you can just ask it any question 0:27:15 and get the answer to how you do that. 0:27:16 – Oh yeah. 0:27:17 – That's like so much better 0:27:19 than like go like opening up another window 0:27:20 and like trying to have to search through like, 0:27:21 oh, how do you do this? 0:27:23 And like learn it all again. 0:27:25 They even have stuff too for like the command line 0:27:27 where it'll help you like doing command line stuff. 0:27:28 It gets really tedious, 0:27:30 especially if you don't do it every day. 0:27:32 If you do it every day, you learn the commands. 0:27:33 It's not bad. 0:27:34 But like, if you do it like once a month, 0:27:36 like me, like you kind of forget some stuff. 0:27:37 – For sure. 0:27:39 – It can just like, oh, here's what you're trying to do. 0:27:40 And here's the command. 0:27:43 – Every tool out there has like an API 0:27:46 and every API is sort of slightly different 0:27:47 from every other API. 0:27:48 – Yeah. 0:27:50 – Yet somehow when you use Cursor, 0:27:52 it seems to know how to implement the API 0:27:54 for the tool you're trying to implement, right? 0:27:56 Like if I'm trying to implement 0:28:00 the Whisper speech-to-text feature in an app, right? 0:28:03 It knows how to just use the OpenAI API. 0:28:08 When I was trying to get it to do like a WYSIWYG editor, 0:28:09 right, where I can like, you know, 0:28:12 make fonts bold and stuff in my text, 0:28:15 it just automatically knew how to pull in the proper APIs 0:28:16 for that kind of stuff, right? 0:28:19 So that to me is also really, really impressive. 0:28:22 It's like, you don't need to go to 0:28:23 whatever tool you're trying to integrate with 0:28:26 and learn all about how that API works. 0:28:29 It seems like Cursor will go and do that work for you 0:28:31 to figure out how that API works. 0:28:33 – Yeah. And I think this is an example too of like, 0:28:34 when people are making the AI startup, 0:28:37 they should be looking at Cursor and Perplexity. 0:28:42 Like model agnostic, ship fast, solve a specific problem. 0:28:44 – Yeah. And then we got Replit, 0:28:45 which I have used Replit a little bit, 0:28:50 but to me, Replit was more of just like that IDE, right? 0:28:51 It was more of like a replacement 0:28:53 for Visual Studio Code for me, 0:28:55 but it was saving it all to the cloud 0:28:57 instead of saving it all to my computer. 0:28:58 – Well, yeah, on that side it's like, 0:29:00 it's like on the, it's for like deployment, right? 0:29:01 Like they're kind of a competitor 0:29:03 with like a Vercel and some of these other startups 0:29:05 that help you easily deploy websites. 0:29:08 But I'm specifically wanting to rank the Replit AI agent, 0:29:10 which is like, is a newer thing that they have. 0:29:14 So I probably put it in like a C tier right now 0:29:16 with the potential to be an S tier. 0:29:18 Like I tried it, it's cool. 0:29:20 I mean, at least you go from like start to finish. 0:29:23 Like you can literally, I sat down with my son 0:29:24 and he had a ton of fun.
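As a point of reference for the Whisper example Matt mentions a moment earlier (having Cursor wire OpenAI speech-to-text into an app), the call that kind of request usually boils down to is roughly the sketch below. The audio file name is a placeholder, and this is just one common way to do it, not necessarily what Cursor would generate.

```python
# Rough sketch of an OpenAI speech-to-text call using the hosted Whisper model.
# Requires OPENAI_API_KEY in the environment; "interview.mp3" is a placeholder file.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("interview.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",  # OpenAI's hosted Whisper model
        file=audio_file,
    )

print(transcript.text)
```

The point of the anecdote still stands either way: the win with Cursor is that you rarely have to go dig through the API docs yourself to arrive at a call like this.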
0:29:26 We basically just sat down, 0:29:28 we gave it some ideas of what we wanted to build, 0:29:29 like a website. 0:29:32 He wanted to do something that was a Minecraft 0:29:33 and League of Legends. 0:29:36 And it was like a guide for these two or something, you know? 0:29:38 And it went from beginning to end 0:29:40 and it shows you the code too. 0:29:41 And so you can dive in more 0:29:43 and see the actual code it's generating as well if you want. 0:29:45 It takes a long time. 0:29:46 Like a lot of times it was taking 0:29:48 like three to five minutes to generate the stuff. 0:29:51 But it helps you create the entire website 0:29:52 and then deploy it. 0:29:54 So it's like from beginning to end, 0:29:57 the quality was not as good as if I wouldn't, 0:29:59 if I would have just used my own template 0:30:00 and then probably used Cursor to help me and stuff, 0:30:02 I probably would have got something better out. 0:30:04 But you could see long-term where 0:30:06 there will be a class of people who, 0:30:08 if you have an idea for a startup, 0:30:10 probably the best way in like a year from now 0:30:13 will be to go to something like the Replit AI agent 0:30:15 and say, make me this landing page. 0:30:17 It's gonna make you a beautiful landing page. 0:30:18 It'll deploy it for you. 0:30:21 The whole thing will be done in 10 minutes. 0:30:22 Yeah, yeah, yeah. 0:30:22 Right, right. 0:30:25 That's gonna be probably possible in the next year. 0:30:26 And then it'll be S tier. 0:30:27 Or you can just sketch something out on a piece of paper, 0:30:30 like scribble out like a rough layout of what you want 0:30:32 and then upload that image and say, 0:30:34 make me a landing page. 0:30:35 Here's a sort of quick template. 0:30:37 And then it'll make a beautiful version 0:30:39 of what you, you know, you sketched out. 0:30:41 Something that we saw, you know, 0:30:43 back at the GPT-4 demo, 0:30:46 but it's never really panned out 0:30:49 to be like what we've seen in the demos yet. 0:30:50 Right. 0:30:51 Okay, I'll put it there 0:30:53 because I can't convince you otherwise. 0:30:58 So let's, let's, let's do like the AI image generators now. 0:31:01 Let's start with probably the most well-known 0:31:03 with Midjourney. 0:31:06 Where would you rank Midjourney? 0:31:07 F tier? 0:31:08 (laughs) 0:31:10 Yeah, totally. 0:31:12 Some of our friends are gonna kill me if I do that. 0:31:13 God, I have a hard time ranking it. 0:31:15 Like there's like, part of me that wants to say S 0:31:16 and there's part of me that wants to say A. 0:31:17 I guess that's kind of like- 0:31:19 Honestly, I'll be honest with you. 0:31:20 I'm gonna struggle with the AI art tools 0:31:24 'cause like I actually use them all for different reasons. 0:31:25 Yeah. 0:31:26 I mean, I used to, you know, 0:31:29 when I first started dabbling with AI art, 0:31:30 I used Stable Diffusion 0:31:32 'cause I could run it locally on my machine. 0:31:34 And I just thought that was so cool. 0:31:36 And then I started using Midjourney 0:31:38 and I was like, oh yeah, you get, you know, 0:31:41 it's easier to create beautiful things with it. 0:31:43 I hated that you had to use it on Discord. 0:31:44 It took them forever to get the web version. 0:31:46 I thought that was so dumb. 0:31:47 Well, I would think Midjourney, 0:31:50 I think Midjourney kind of argued that like, 0:31:51 they were more of like a research company 0:31:53 building up a really good art tool.
0:31:55 And they didn’t, they were less concerned 0:31:58 about the user experience at that point, right? 0:31:59 They were just concerned 0:32:02 with making the best image model they could. 0:32:03 This is our user experience. 0:32:05 Live with it or go somewhere else. 0:32:06 We don’t really care. 0:32:08 We’re gonna make the best image generator there is. 0:32:11 And only recently have they started putting like effort 0:32:13 into user experience. 0:32:15 Yeah, I’m kind of convinced that the image generation part 0:32:16 is gonna become a commodity 0:32:18 and the user experience is a thing. 0:32:21 So I think I would put them at A. 0:32:22 I think a lot of people would put them at S, 0:32:24 but I think I’d put them at A. 0:32:25 Just ’cause I think- 0:32:26 – I would agree with A, I think. 0:32:29 – Yeah, I’m not using it as much as I used to. 0:32:30 There’s great alternatives. 0:32:33 I think they ship the website 0:32:35 and the new user experience way too slow, 0:32:38 which makes me more pessimistic on them long-term. 0:32:39 – Yeah, I think, you know, 0:32:40 if I’m going back to how I was sort of 0:32:43 ranking things earlier on in this episode, 0:32:45 I was kind of ranking them by like, 0:32:47 how often I would go to that tool 0:32:50 opposed to the other tools, right? 0:32:52 Mid Journey isn’t really one that I go to 0:32:56 is like one of my top, you know, two or three anymore. 0:32:59 Maybe it’s top three, but it’s not top two, right? 0:33:02 Top two for me would be Leonardo and Stable Diffusion 0:33:04 are the two that I use the most. 0:33:07 I don’t know, I would probably put Mid Journey as B tier. 0:33:09 – Oh, really? Okay. Yeah. 0:33:11 – But I could probably be convinced otherwise. 0:33:13 The thing about Mid Journey is it’s like, 0:33:15 it’s the most well-known. 0:33:18 It’s the one that I feel like at least for a long time, 0:33:21 like ChatGPT was setting the bar with AI Image Generation. 0:33:22 Like everybody was going, 0:33:24 oh, this one’s almost Mid Journey tier. 0:33:26 Oh, have you seen the new version of Stable Diffusion? 0:33:28 It’s almost as good as Mid Journey. 0:33:30 Have you seen the new ideogram? 0:33:32 It’s almost as good as Mid Journey, right? 0:33:37 Like, so like to me, it gains some points as being that one 0:33:39 that sort of bent everybody’s benchmark 0:33:41 of like trying to reach, 0:33:46 but I don’t really find myself using it as often anymore. 0:33:48 And I think it’s just because, 0:33:50 not because I don’t think Mid Journey is very good. 0:33:52 It’s just because I think some of the other models 0:33:55 have gotten better than Mid Journey at this point. 0:33:57 – Yeah, and people are saying that they’re going to move 0:34:00 into like AI video and that’s what’s going on. 0:34:02 And also they’re going to move into, 0:34:05 like you’ve created some kind of character persona 0:34:07 on Mid Journey and that can be consistent 0:34:08 across different things. 0:34:09 So I guess you can make like comics and stuff. 0:34:11 So maybe they will, 0:34:13 maybe they are ahead in that area and we don’t know, 0:34:16 but also recently Sam Altman kind of like had this grin 0:34:19 on his face when he was talking about how much better AI art 0:34:21 in video is going to get soon. 0:34:24 So I have a feeling that behind the scenes too, 0:34:26 like all the other stuff that OpenAI is building 0:34:28 is probably going to give them some edge there too 0:34:29 that people are not going to expect. 
0:34:31 Just like when we first all saw it, it’s like, 0:34:33 holy crap, that’s amazing. 0:34:34 They probably have other things that are amazing 0:34:37 that have not been revealed yet, if I had to guess. 0:34:39 – The other sort of like negative against Mid Journey 0:34:41 is of the image generators. 0:34:44 It’s probably the most expensive to use as well. 0:34:45 – Yeah. 0:34:46 I keep canceling it and then re-subscribing. 0:34:48 I’m like, I am not using it and I’m like, it’s expensive. 0:34:49 Then I cancel it. 0:34:50 I’m like, I want to use it again. 0:34:52 I subscribe, but this happened so many times now. 0:34:53 – Yeah, yeah. 0:34:55 So like, you know, for me, 0:34:57 because it’s on the more expensive end 0:34:59 of the AI image generator models, 0:35:03 I personally don’t find myself going to it as much anymore. 0:35:03 That’s why I’d put it there. 0:35:06 I feel like flux might have passed it 0:35:10 or at least caught up with it as terms of realism. 0:35:11 Right? 0:35:14 And when it comes to like artuny, like colorful, 0:35:18 high contrast, like aesthetically pleasing images. 0:35:20 For me, I tend to go to Leonardo a little bit more. 0:35:23 Now I will say that I am an advisor for Leonardo. 0:35:24 Like I feel like- 0:35:26 – So S tier, of course, Leonardo’s S tier, right? 0:35:27 – I need to disclaim that that like, 0:35:30 I do have equity in Leonardo. 0:35:33 But I legitimately do go and use it more 0:35:35 than I use Mid Journey, right? 0:35:39 Like I do go and I don’t think I’m going to put it in S tier. 0:35:40 I’ll probably put it in A tier 0:35:43 just ’cause it’s the one that I use the most. 0:35:45 – Wait, are we finishing up Mid Journey though? 0:35:46 So let’s- 0:35:47 – Oh yeah, okay. 0:35:49 – So I agree with, I had said S or A, 0:35:52 but I think you kind of convinced me of B, honestly. 0:35:53 I think that’s like a good spot 0:35:57 ’cause like they were S tier for a while, now they’re B. 0:35:59 Yeah, that makes sense to me. 0:36:00 – Yeah, and then, you know, 0:36:02 so I was sort of sort of rolling into Leonardo 0:36:04 just because I was like referencing it 0:36:06 in comparison to Mid Journey. 0:36:08 For me, Leonardo is like, 0:36:10 it still has some of the like issues 0:36:11 you might get with stable diffusion 0:36:13 where you get some of the like seven finger hands 0:36:18 or like, you know, like that bird flying up there 0:36:19 looks a little weird. 0:36:20 Why does it have three eyes or whatever? 0:36:22 Like you get some of that weird stuff 0:36:24 that you get with stable diffusion, 0:36:27 but aesthetically like the Leonardo Phoenix model, 0:36:28 which is their proprietary model. 0:36:31 It’s not like one of the stable diffusion models. 0:36:33 I think they’ve gotten, it’s gotten really, really good. 0:36:36 It’s like, if you look at my YouTube thumbnails, 0:36:37 the way I make those thumbnails 0:36:41 is I usually generate an image in Leonardo first 0:36:42 and then I take that image 0:36:44 and then I pull it into stable diffusion 0:36:46 that I have locally installed on my computer 0:36:49 and I do a face swap with stable diffusion. 0:36:52 So it’s like the image was made in Leonardo 0:36:54 and then stable diffusion is what I use 0:36:56 to sort of mask out my face 0:37:01 and then put my real face on the image that was generated. 0:37:03 So I’m using that one constantly 0:37:05 ’cause it’s what all the thumbnails are made with. 0:37:06 But I don’t know. 0:37:08 Have you, have you used Leonardo before? 
0:37:10 – Yeah, I have like twice. 0:37:11 Honestly, it’s been a while 0:37:13 so I definitely haven’t tried the Phoenix model. 0:37:15 My impression was like when I tried it, 0:37:17 I was like surprised how good it was to be honest with you 0:37:19 ’cause I felt like I hadn’t heard much about it 0:37:21 except from you honestly. 0:37:23 Like I had heard about it a little bit on the X 0:37:26 but like I was like nowhere near as much as mid-journey. 0:37:27 And when I tried it, when I, 0:37:29 this was probably a year ago now. 0:37:31 So it’s probably way better since then. 0:37:35 I felt like it was 95% as good as mid-journey 0:37:37 in terms of quality, which surprised me. 0:37:40 I was like, oh, this is pretty close to mid-journey 0:37:42 and I liked the interface a lot better. 0:37:46 Like I thought it was a cooler app to use honestly. 0:37:47 I think I just had the mid-journey subscription 0:37:49 so I just ended up not going back to Leonardo. 0:37:51 That’s honestly like kind of what happened, but. 0:37:52 – When it comes to realism, 0:37:55 I think mid-journey still has Leonardo beat. 0:37:56 You know, I think flux in mid-journey 0:37:59 are better at realism than Leonardo. 0:38:00 But when it, like, do you remember back 0:38:03 when like mid-journey was on like version three 0:38:05 and they had these like really like vibrant, 0:38:08 like contrasting colors and every image you looked 0:38:10 had like this mid-journey aesthetic 0:38:12 that just had like this cool look to it, right? 0:38:15 That’s how I feel about the Leonardo Phoenix model 0:38:19 is it’s like, it’s got this high contrast HDR 0:38:22 sort of like really, you know, the colors really pop. 0:38:24 It’s got like that aesthetic to it, 0:38:26 which to me is really, really pleasing. 0:38:28 I feel like with mid-journey, 0:38:30 like each version that’s come out of mid-journey, 0:38:33 they’ve sort of moved away from that like mid-journey aesthetic. 0:38:35 And now they just kind of look like, you know, 0:38:37 whatever you’re prompting, 0:38:39 like whatever style you’re prompting, 0:38:42 Leonardo to me still has like a style, I guess. 0:38:45 – Yeah, so you are using it for your thumbnails 0:38:47 on a daily basis, is that, that’s right? 0:38:48 – Well, I don’t put out a new video every single day. 0:38:50 – Not daily, not daily, yeah, but yeah. 0:38:51 – Every video I put out, yeah, 0:38:54 I use Leonardo for the thumbnail, yeah. 0:38:57 – Yeah, I’m kind of like, I feel like I could see A, 0:38:59 but also like people are gonna freak out about that 0:39:01 and saying Leonardo over mid-journey. 0:39:02 And also since you’re an advisor, 0:39:05 I’m like, damn, I should probably put it as a B. 0:39:08 – I’ll put it as a B, just as a precautionary measure. 0:39:11 I, you know, so here, 0:39:12 just kind of thinking it through here, right? 0:39:14 I think mid-journey and Leonardo 0:39:17 should be kind of on the same level. 0:39:19 Maybe like, if we’re right and I’m like this way too, 0:39:22 so like this is a higher B maybe, 0:39:25 but like they’re probably about as good as each other. 0:39:26 Like I would go to mid-journey 0:39:28 if I need something a little bit more realistic, 0:39:29 I would go to Leonardo. 0:39:31 If I’m looking for that sort of like aesthetic, 0:39:32 the color’s gonna pop, 0:39:34 it’s gonna make a good thumbnail kind of thing. 0:39:35 – Yeah. 0:39:36 – But they’re probably on par with each other. 0:39:38 – Okay, I could see that. 
0:39:41 – All right, so I put Stability AI here, 0:39:43 but this is meant to be for Stable Diffusion. 0:39:46 Stable Diffusion itself doesn't really have a logo 0:39:49 and everybody kind of like associates Stability AI 0:39:50 with Stable Diffusion. 0:39:52 I mean, so for me, Stable Diffusion, 0:39:54 I'm probably just gonna put it right alongside 0:39:57 these other ones 'cause I use it just as much 0:39:58 as I use Leonardo, right? 0:40:01 Like that's what I use for like my face swapping. 0:40:02 Maybe I put it as a C tier actually, 0:40:05 just because like I don't use it to generate images, 0:40:08 like almost ever. I use it to do like inpainting. 0:40:12 So I generate the image with a Midjourney or a Leonardo 0:40:14 or like one of those tools first. 0:40:16 And then the only thing that I'm using Stable Diffusion 0:40:19 for at this point is to go in, mask out my face 0:40:23 and then like superimpose my face using AI. 0:40:24 So I can't, I don't know, 0:40:26 I don't think I can put it on the same level 0:40:28 as these other ones, but- 0:40:30 – I was gonna say F, which is like probably getting me killed 0:40:31 by like everybody who- 0:40:33 (both laughing) 0:40:36 But I mean, I feel like they dropped the ball so much. 0:40:38 Like early on- 0:40:39 – Yeah, but I think you're referring to Stability AI, 0:40:41 the company, right? 0:40:42 – Yeah, okay. 0:40:43 Yeah, true. 0:40:45 – Stable Diffusion is like an open source 0:40:47 image generation model. 0:40:50 Stability AI has made a handful of the weights 0:40:54 that have been made available, like SDXL, SD 2.0, SD 3.0. 0:40:56 Like those were made by Stability AI, 0:40:59 but Stable Diffusion itself was an open source model 0:41:01 that was built before Stability AI even existed. 0:41:05 – Right, which was super, like I feel like 0:41:06 there wasn't a whole lot of transparency around that. 0:41:07 So yeah, I'm probably judging the company, 0:41:10 not the model itself. 0:41:13 Cause like I have a lot of feelings about the company 0:41:16 in terms of the statements that were made to investors 0:41:17 and other things. 0:41:19 – When SD, was it 3.0, came out 0:41:21 and everybody was generating the images 0:41:22 of people like laying in the grass 0:41:24 and it would have like these mutated people 0:41:27 that had like only legs and no head 0:41:29 and you know, random stuff like that. 0:41:30 – Yeah, yeah, yeah. 0:41:31 I guess I was also like, you know, 0:41:34 early on I was using Stable Diffusion a lot. 0:41:36 And I just, I personally don't use it anymore, 0:41:38 but it sounds like you do actually have a friend 0:41:39 who has a company as well. 0:41:41 And I know he, like he has like a discord bot 0:41:42 or something like that. 0:41:43 I know he uses it. 0:41:45 So it sounds like people are using it. 0:41:47 So probably, yeah, it's probably not an F. 0:41:48 – Well, I guess if we're looking at that as a whole, right? 0:41:51 Like most of the models that people are gonna use 0:41:54 for Stable Diffusion today are the models 0:41:56 that were generated by Stability AI. 0:41:59 So I guess like, you can kind of lump it in, you know? 0:42:00 Well, just as we've been talking about this, 0:42:03 I've already knocked it down two tiers. 0:42:04 – Okay, okay. 0:42:06 Yeah, D or C, I think D seems good. 0:42:07 Like it's gonna be controversial. 0:42:08 Some people are gonna say it should be like an A. 0:42:10 I don't think anyone's gonna say S.
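For anyone curious what the inpainting step Matt describes actually looks like in practice (generate the image elsewhere, then have a locally run Stable Diffusion model redraw only a masked region), here is a minimal sketch using the diffusers library with one of Stability AI's published inpainting checkpoints. It is a generic inpainting example, not Matt's exact face-swap setup; the file names and prompt are placeholders, and dedicated face-swap extensions are a separate layer on top of this.

```python
# Minimal inpainting sketch: repaint only the white area of a mask on top of an
# image that was generated in another tool (e.g. Leonardo or Midjourney).
# File names and the prompt are placeholders; requires a CUDA GPU as written.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",  # one publicly released inpainting checkpoint
    torch_dtype=torch.float16,
).to("cuda")

base = Image.open("thumbnail_from_leonardo.png").convert("RGB").resize((512, 512))
mask = Image.open("face_mask.png").convert("RGB").resize((512, 512))  # white = repaint

result = pipe(
    prompt="photo of a man's face, natural lighting, high detail",
    image=base,
    mask_image=mask,
).images[0]
result.save("thumbnail_swapped.png")
```

Everything outside the white region of the mask is left untouched, which is why this kind of workflow works for dropping a real face into an otherwise AI-generated thumbnail.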
0:42:13 – Well, again, my logic on it is like, 0:42:15 how often do I find myself going to it 0:42:16 when I need to generate stuff? 0:42:18 And I do use it often, 0:42:20 but I don’t use it to generate the images. 0:42:21 Like the only value I’m getting 0:42:26 from stable diffusion right now is that face swapping feature. 0:42:28 – Okay, D, let’s leave it at D. 0:42:29 We have to have something that’s lower. 0:42:30 I mean, like. 0:42:33 – But looking at what’s left, what would go lower? 0:42:34 – Yeah, yeah, that’s true. 0:42:35 That’s true. 0:42:37 We may not get anything at F. 0:42:39 Like we’ll just like the people who make it. 0:42:40 – We’re just too nice. 0:42:41 Yeah, we don’t want to burn any bridges 0:42:43 with any companies completely. 0:42:46 So it’s like hard to throw anything below a D. 0:42:47 All right, but stable diffusion itself, 0:42:51 let’s leave it in D because it is open source. 0:42:54 You can literally generate anything 0:42:55 if you’re running into like roadblocks 0:42:59 with censorship or whatever with any of the other AI models. 0:43:03 All right, so this next one here, this is Playground AI. 0:43:05 Quite honestly, I haven’t used it a ton, 0:43:07 but I did use it recently. 0:43:10 They’ve sort of rebranded a little bit 0:43:14 to being like the AI generation tool 0:43:16 for like graphic designers, right? 0:43:18 Like they’ve made it really good 0:43:21 at creating almost like vector style art. 0:43:22 Like it’s not actually generating vectors, 0:43:25 but it’s generating like vector style art, 0:43:28 like logo type stuff and stuff that would be good 0:43:31 on like t-shirts and mugs and stuff like that. 0:43:34 It seems like they’ve kind of gone more that direction 0:43:36 and sort of niched into it, right? 0:43:39 Like you’ve got like scenario.gg, right? 0:43:42 They’ve really, really like leaned into like game assets 0:43:46 where Playground, they’ve really seemed to lean into like 0:43:47 where what you go to if you want to create logos 0:43:51 or graphics for like your company assets 0:43:53 and icons and things like that. 0:43:57 And for that use case, it seems pretty dang good, 0:43:58 but I haven’t used it a ton. 0:44:00 Like that was me playing around with it 0:44:02 for like 30 minutes or so. 0:44:04 – Having a hard time ranking this one 0:44:05 for like multiple reasons. 0:44:06 Like I haven’t tried it recently. 0:44:09 I tried it before, haven’t tried it recently. 0:44:09 Next time we do this, 0:44:12 I probably should try everything before we do this. 0:44:15 And you know, the founder, Sue Hale, 0:44:18 I have a lot of mutual friends with him in Silicon Valley. 0:44:19 He created Mixpanel back in the day, 0:44:22 which was like, was for a while considered 0:44:23 like the coolest analytics tool. 0:44:25 Like instead of using Google Analytics, 0:44:27 you use Mixpanel if you wanted more details 0:44:29 and the stuff that mattered to you. 0:44:32 So, you know, he’s definitely a great founder, 0:44:34 but when I did try it, 0:44:37 like the user experience and interface was like really good. 0:44:42 It was like that and Leonardo were like two of the ones 0:44:43 I thought were pretty cool. 0:44:47 – I mean, just kind of going back on my sort of fallback 0:44:51 like logic with how I ranked everything else. 0:44:53 I don’t really go to that one very often. 0:44:56 I mean, I use stable diffusion more than I use playground. 0:44:58 But I also think that this should probably still rank 0:45:00 above stable diffusion. 
0:45:03 ‘Cause as far as like people listening to this episode, 0:45:07 it’s a lot more user friendly than stable diffusion. 0:45:08 Like you’re probably more likely to go, 0:45:10 like unless you’re using it for a very niche use case, 0:45:12 like I am with stable diffusion 0:45:15 where I’m going and face swapping for YouTube thumbnails, 0:45:17 you’re probably not gonna use stable diffusion, right? 0:45:19 You’re probably gonna prefer something like playground, 0:45:22 which has stable diffusion models built into it. 0:45:24 So you could just select the model 0:45:25 and get playground to generate the image 0:45:27 with stable diffusion for you, right? 0:45:31 Also, it’s similar to like the AI version 0:45:35 or the AI art version of like a perplexity or a cursor 0:45:37 where you can go into playground. 0:45:40 You can use playground model V3, 0:45:41 but you can also choose Dolly three 0:45:46 or you can also choose stable diffusion SDXL or SD 3.5. 0:45:49 Like you could go and pick whatever model you want to use 0:45:52 or you can use their own like proprietary built-in model. 0:45:56 So I think like that gives it more value 0:45:57 than stable diffusion right off the bat 0:46:00 ’cause it does what stable diffusion does plus, you know? 0:46:02 – Yeah, and anyone could listen to this 0:46:03 and just go instantly try it 0:46:05 and get the gist of it in like a few minutes. 0:46:08 – But as far as like the images that it generates, 0:46:10 I don’t think I’d put them on the same level 0:46:12 as Leonardo’s mid-journey. 0:46:15 I like, I don’t think it’s up to par with those quite. 0:46:18 Like it’s close, but it’s not quite as good as those 0:46:20 like with the quality of image. 0:46:23 – Yeah, the last time I saw images from it, 0:46:26 it seemed like kind of closer to like stable diffusion level. 0:46:27 – Yeah, yeah. 0:46:29 And then we’ve got, just moving on, 0:46:30 we’ve got Adobe Firefly. 0:46:33 This might be my first E tier. 0:46:36 Here’s my problem with Adobe Firefly. 0:46:38 Out of all the image generation models, 0:46:41 it is the absolute hands down most censored, right? 0:46:43 Like I tried to get it to generate an image 0:46:44 of the Eiffel Tower, 0:46:47 and it said, “No, we can’t generate that for you.” 0:46:50 And I went and looked it up and apparently images 0:46:53 of the Eiffel Tower are technically trademarked 0:46:54 or copyrighted or something like that. 0:46:57 Like it wouldn’t generate an image of the Eiffel Tower. 0:47:00 When I get it to generate images of actual people, 0:47:01 it’s not great. 0:47:03 Like the realism isn’t quite there. 0:47:06 When it comes to like aesthetics and generating images 0:47:09 that like I just think are aesthetically pleasing 0:47:10 with like the color palette and the contrast 0:47:13 and stuff like that, never very impressed. 0:47:16 Like every time I’ve gone to Adobe Firefly to go try it 0:47:19 to see if like, okay, has this one gotten better 0:47:21 than it used to be? 0:47:24 It doesn’t really, like the UI has gotten a lot better. 0:47:27 Like the UI, I have no beef with the UI. 0:47:28 I think it’s a decent UI. 0:47:31 I just really, really don’t like the images it generates. 0:47:33 And it is so insanely censored 0:47:35 that like I’m always sitting here banging on my desk. 0:47:37 I was so frustrated that some of the images 0:47:38 it wouldn’t generate. 0:47:39 – Yeah. 0:47:41 So I only saw examples of it then. 0:47:42 I haven’t tried. 
0:47:45 I used generative fill and some of the other Photoshop features 0:47:47 they rolled out, which are pretty cool. 0:47:48 But yeah, it sounds pretty bad. 0:47:50 Like I didn’t see any examples. 0:47:52 They looked great either, like on Twitter or anything. 0:47:54 Like there was nothing that was like, oh, that’s awesome. 0:47:59 And to think about like Photoshop for, it’s a creative tool. 0:48:02 It’s like, how can you put such a restriction 0:48:05 on a tool that creators use on a daily basis? 0:48:07 Like there’s no logic there for that. 0:48:10 Cause like, maybe they’re doing it for legal reasons, 0:48:12 but like, it’s like saying that you’ve got a hammer 0:48:13 and you can’t use a hammer in certain ways. 0:48:16 Like, well, that’s, why would I buy that hammer? 0:48:19 It’s like, you know, it’s a, like you can use Photoshop 0:48:21 to draw anything, right? 0:48:22 Like so. 0:48:24 – Well, you’re bringing up some points here 0:48:25 that actually want to touch on. 0:48:29 Like Adobe claims to be like the only AI, 0:48:31 like ethical image model, right? 0:48:33 Because they’ve only trained on like images 0:48:35 that are inside of Adobe stock 0:48:38 and like public domain photos, right? 0:48:43 So they claim that like, you’re not generating on a model 0:48:46 that was trained on like everybody else’s artwork 0:48:49 because they had the license to use all of the images 0:48:51 in their training data. 0:48:53 So the people that are really, really concerned about the, 0:48:56 like the ethics of how the models were trained 0:48:58 would probably go to Adobe over the others 0:49:02 because of that sort of like ethical, you know, 0:49:04 piece that they put on there. 0:49:08 Also, you mentioned that Firefly in Photoshop, 0:49:09 that has actually been useful. 0:49:11 Like, I kind of forgot about the fact 0:49:12 that it’s actually using Firefly 0:49:14 when you’re using Photoshop. 0:49:16 And I do every once in a while use Firefly 0:49:17 to like fix stuff, right? 0:49:20 Like I’ll generate an image with Leonardo 0:49:21 or Majorna or something like that. 0:49:23 And there’ll be like a weird artifact in it. 0:49:26 And I’ll pull it into Photoshop, 0:49:28 circle that little weird artifact, 0:49:30 click the generate, generate a fill button 0:49:32 and it’ll like remove that for me, right? 0:49:34 And it works at that really, really well. 0:49:35 But even saying that, 0:49:37 the censorship is so bizarre sometimes. 0:49:40 – I honestly think this should be our first F tier 0:49:43 because like a creative tool that doesn’t allow you 0:49:44 to create things you wanna create. 0:49:46 And also I haven’t seen anyone sharing good examples 0:49:47 of using it. 0:49:49 I haven’t heard of anyone actually using it. 0:49:51 – Well, everybody uses it in generative fill though. 0:49:54 Like if you’re using Photoshop with generative fill, 0:49:56 that’s where you’re most likely gonna use it, right? 0:49:57 – Yeah. 0:49:58 – And that’s where I find myself using it 0:49:59 from time to time. 0:50:01 But even there, like I’ve had times where like, 0:50:03 it would be my face in an image 0:50:05 and I would circle my eyes and say like, 0:50:06 put sunglasses on me. 0:50:07 And it would say like, 0:50:08 oh, our guidelines don’t allow that. 0:50:12 Like sometimes the guidelines just present false positives 0:50:13 or false negatives 0:50:15 or you know, whatever direction that should go. 0:50:18 It thinks you’re trying to generate something 0:50:20 that you’re not actually trying to generate, right? 
0:50:22 And sometimes it’s just like really weird like that. 0:50:25 Or like, like I’ve had issues where like, 0:50:27 I had like a six fingers on one of the hands 0:50:28 that it generated, right? 0:50:30 And I was like, circle one of the fingers 0:50:31 and it’ll be like, oh, sorry. 0:50:32 This is against our guidelines. 0:50:34 We can’t change this image 0:50:36 ’cause it obviously thinks the finger is like, 0:50:38 you know, a different appendage or something. 0:50:41 And so like, you run into that kind of issue with it. 0:50:42 But I have a hard time putting in F tier 0:50:44 because I do find it useful. 0:50:46 – But we should treat it as like a standalone 0:50:47 versus a generative fill. 0:50:49 Like if we had generative fill, 0:50:50 maybe we’re talking like a B tier or something 0:50:53 like that feature in Photoshop. 0:50:56 But for like a standalone image generator, 0:50:59 I think we have to judge it that way 0:51:00 to get like an accurate judgment on it. 0:51:02 And I do think that that would be an F. 0:51:03 – Yeah. 0:51:05 You know, and another negative for Adobe too 0:51:08 is like their pricing model 0:51:10 has always spelt a little predatory, right? 0:51:11 Like where they do the whole thing 0:51:13 where it’s like, it’s this much per month, 0:51:15 but you have to agree to a year long contract 0:51:17 to get that at this much per month. 0:51:18 And then people don’t use the tool anymore 0:51:19 and they can’t get out of there. 0:51:21 It’s just, that to me feels very icky. 0:51:23 And I’ve always hated that about Adobe. 0:51:24 – Yeah. 0:51:27 There’s been once or twice where like unintentionally 0:51:29 I end up paying them like a 500 bucks or something. 0:51:30 – Yeah. 0:51:31 – And it’s like, that sucks. 0:51:33 – All right, so next up is Magnific. 0:51:36 I haven’t actually used Magnific to generate images, 0:51:38 but I’ve used their upscaler quite a bit. 0:51:39 And I think it’s pretty cool. 0:51:41 But I think the big complaint about Magnific 0:51:42 is the cost, right? 0:51:46 A lot of people say it’s like for an upscaler, 0:51:48 it’s like kind of pricey for that. 0:51:50 But what are your thoughts on it? 0:51:52 – Yeah, I got pretty crazy. 0:51:55 That’s my problem as well. 0:51:57 Is they did give me some credits 0:51:58 so I could play around with it. 0:52:00 But like, whenever I’ve talked about it 0:52:01 on a YouTube video or something like that, 0:52:05 people are always like, Magnific, prices is crazy, you know? 0:52:06 – Right. 0:52:07 I haven’t used it recently. 0:52:09 So I’m gonna have a hard time giving it 0:52:11 like a really high ranking. 0:52:13 But when I used it, it seemed great. 0:52:14 Like upscaling was always one of the issues. 0:52:16 Like you’d use different AI art tools. 0:52:18 And then like there was, it was crazy 0:52:20 that there was no good upscalers. 0:52:21 It was like, why is it? 0:52:23 How’s this still an issue? 0:52:24 And then the fact that, 0:52:25 and then it was kind of cool 0:52:26 how it would actually make them look better. 0:52:27 Like some stuff you put in there 0:52:29 and all of a sudden look nicer. 0:52:30 Which sometimes that was bad too. 0:52:32 ‘Cause sometimes it would actually change 0:52:34 the essence of the image as well. 0:52:36 So I had a hard time kind of like playing with the tweaks 0:52:38 on that and stuff like, how do you get that right? 0:52:40 – It upscales, but it also hallucinates. 0:52:42 Like it hallucinates on purpose, right? 
0:52:43 Like it will add extra stuff 0:52:45 that wasn’t in the original image. 0:52:48 But it creates this like cool effect. 0:52:49 Like I’ve always thought it looked really cool. 0:52:52 Like we’ve seen those things that circle on Twitter 0:52:54 and stuff where it’s like Laura Croft 0:52:55 from like the original Tomb Raider game 0:52:57 and somebody upscaled it with Magnific. 0:53:01 And then it looks like a triple A game graphics. 0:53:02 I wouldn’t say it looks like ultra realistic, 0:53:04 but it looks like modern day game graphics 0:53:06 with the upscale done to it. 0:53:09 Like some of that kind of stuff is really, really cool. 0:53:12 But to me, it falls into the category of like novelty. 0:53:14 Like I don’t use it a ton. 0:53:16 – Yeah, I thought you were gonna say like the new Lara Croft 0:53:18 and then it changed it to more like the original Lara Croft. 0:53:20 – Well, I’m sure you can go that direction 0:53:20 if you want it as well. 0:53:24 (all laughing) 0:53:24 – Yeah, I don’t know. 0:53:28 I’m feeling like probably, geez, where would we put it? 0:53:30 Should we put it like C or B? 0:53:31 – See, I’m going B or C. 0:53:33 I’m probably leaning more towards C 0:53:36 ’cause if I’m looking at like the value of the tools in B, 0:53:39 I don’t think I put Magnific at like the same level 0:53:41 of value of these tools. 0:53:43 – And it’s not to like disclose. 0:53:45 We probably both have talked to, what’s his name? 0:53:47 Javi, is that how you say it? 0:53:47 – Javi, yeah, yeah. 0:53:52 I’ve talked to him several times on Twitter, nice guy. 0:53:54 I was kind of shocked that he built this 0:53:56 ’cause like I just knew him as like this guy on Twitter. 0:53:58 And it was like, awesome that he built this amazing startup. 0:54:00 So it’s like props to him, 0:54:01 but he’s already sold to the company now. 0:54:03 – It’s got the FreePik, right? 0:54:04 So now it’s owned by FreePik, 0:54:06 which FreePik is an AI image generator 0:54:09 that also has Magnific built into it. 0:54:10 – Yeah, so we can talk more crap about it 0:54:11 since he’s already sold it. 0:54:12 (all laughing) 0:54:15 No, so I would say C ’cause it’s expensive 0:54:20 and just long-term, I don’t see the value long-term 0:54:22 just because it seems like, you know, 0:54:24 and I think I may have even said this to him. 0:54:25 “Yeah, you should sell it, like if you can.” 0:54:27 (all laughing) 0:54:29 Because like it took off like crazy fast 0:54:30 and it’s like, but like long-term, 0:54:33 how does that become its own company or own product? 0:54:36 Like it’s an upscaler, like a cool-upscaler. 0:54:37 – Yeah. 0:54:39 – That should be part of mid-journey or Leonardo. 0:54:41 – We’ve kind of covered the AI art. 0:54:43 Now we’re sort of shifting into AI video here. 0:54:46 We’ve got Runway, Pika, Kling and Luma, 0:54:48 ’cause Luma’s got Dream Machine 0:54:50 and Dream Machine was pretty damn groundbreaking 0:54:51 when they showed it off. 0:54:53 But let’s start with like, let’s start with Runway. 0:54:57 For me, Runway, for me, Runway is an S tier. 0:55:02 And my reasoning for that is like, Runway just ships, man. 0:55:04 In the last couple of weeks, 0:55:05 we saw Runway ship act one, 0:55:07 which was like the sort of lip sync. 0:55:10 You can, and it follows the emotions of a character 0:55:12 so I can make a video where I’m talking 0:55:15 and I’m moving my eyes around and every time I blink, 0:55:17 the animation on the video blinks, it does all that. 
0:55:22 And then they released the multiple camera angles 0:55:25 where you can make a video from like any angle 0:55:27 of an image that you upload in there. 0:55:30 And they, and that’s really cool. 0:55:32 And then they’ve got like frame interpolation 0:55:34 for like images where you can take one image 0:55:36 and another image and have it like frame interpolate 0:55:38 between them and do this like morphing effect. 0:55:40 And they’ve got the ability to go 0:55:42 and like remove the background of any video 0:55:44 and turn it into like a green screen video. 0:55:47 And like to me, when it comes to AI video, 0:55:51 like every week, Runway’s got some sort of cool new feature. 0:55:53 They just, they just ship and they ship and they ship. 0:55:55 And the generated videos that come out of Runway 0:55:58 are really impressive to me. 0:56:00 When I’m going to generate up an AI generated video, 0:56:03 I’m either using Runway or Luma most of the time. 0:56:06 And for me, I’d put it in S tier. 0:56:08 – Yeah, I get the S tier. 0:56:10 – I just don’t think it’s on the same level 0:56:12 as those other three companies 0:56:16 in terms of like actual value that can be created with it. 0:56:18 Like as of right now. 0:56:19 – That makes sense. 0:56:21 And I’m biased ’cause I do a lot with video, right? 0:56:23 So like I’m playing with video tools a lot. 0:56:25 I’m sure a lot of people that listen or watch this, 0:56:29 you know, this podcast don’t do AI video as much as I do. 0:56:33 But when it comes to the AI video platforms 0:56:35 that are out there, to me, Runway always seems 0:56:37 to be like one step ahead. 0:56:39 Again, we don’t know where open AI is with video. 0:56:42 Like they haven’t released it to the public. 0:56:44 You know, we talked to Don Allen, Stevenson, 0:56:45 the third on our podcast, 0:56:47 who is somebody who actually got to use Sora. 0:56:49 And basically he was like, 0:56:50 I can’t give you more details, 0:56:52 but it’s better than you think, right? 0:56:57 So like there’s good stuff coming out of open AI 0:57:00 in that video front at some point. 0:57:02 But as far as tools we actually have access to 0:57:03 and can play with right now, 0:57:05 I think Runway sort of leading the pack 0:57:07 with AI video at the moment. 0:57:08 So I think A makes sense. 0:57:10 All right, so next up you got Pika. 0:57:12 There’s no way I’m putting Pika in the same 0:57:14 like realm as Runway. 0:57:18 Pika generates AI generated videos as well. 0:57:20 The videos are not great. 0:57:23 Like I don’t really feel like they’ve got great realism, 0:57:27 but their new sort of thing is like, you know, 0:57:29 you see the videos where like there’s a giant 0:57:32 like compressor that compresses somebody 0:57:34 or like the is it cake kind of thing 0:57:36 where a knife comes in and like slices, 0:57:37 whatever you’re looking at, 0:57:38 it makes it look like it’s cake 0:57:40 or like they got that like balloon feature 0:57:42 where it’ll blow up really big and then like float away. 0:57:45 And it’s got like all those kind of like gimmicky 0:57:49 like like effects that you can put on videos. 0:57:50 And that’s kind of fun to play with. 0:57:53 There’s not a whole lot of use out of it 0:57:58 out of like out of like memes, but it’s kind of fun. 0:57:58 Yeah, yeah. 0:58:01 And when they first came out, I got, I got like a demo. 0:58:03 I’m not sure if you did, but like a few like AI influencers 0:58:04 got a demo before other people. 
0:58:08 It was at that moment, like for whatever reason, 0:58:10 like cartoon kind of characters, it seemed pretty good. 0:58:12 It seemed like better than some of the other models 0:58:14 for like that kind of art, that kind of style. 0:58:17 But then it seems like there’s been so many other things. 0:58:18 There’ve been all these models from China, 0:58:21 Kling and others that are like probably better looking 0:58:22 for that kind of stuff. 0:58:24 The special effects look cool. 0:58:25 Maybe that’s what they’ll keep doing. 0:58:27 Maybe it’ll be like the special effects 0:58:29 of viral memes kind of company, 0:58:30 but it seems like that’s expensive. 0:58:33 Like are people gonna really pay for that? 0:58:35 Like, you know. 0:58:37 – That’s where I run into like a little bit of struggle 0:58:39 with like ranking some of this stuff is it’s like, 0:58:43 I think it’s really cool and it’s fun to play with, 0:58:45 but I don’t know if I see the real world use case 0:58:47 where like, if you’re paying for it, 0:58:49 you’re gonna get ROI on it. 0:58:49 You know what I mean? 0:58:51 – Yeah, it does seem like that kind of puts it 0:58:52 in like the C though. 0:58:55 – I think I agree with C. 0:58:58 I think, I think it’s kind of like a novelty and, you know, 0:59:03 like realistically, most tools should fall under a C, right? 0:59:05 Like that’s, that should be like our average. 0:59:06 If it’s like a bell curve, right? 0:59:07 So. 0:59:08 – Yeah, it’s cool. 0:59:09 It’s cool. 0:59:10 It’s fun to use. 0:59:14 It’s not the best one out there, but it’s not bad. 0:59:16 I am skeptical about the company longterm 0:59:18 just because I don’t know how they build a business 0:59:20 around that because like, well, with runway, I can see it. 0:59:23 Like if you get baked into Hollywood and stuff, 0:59:25 obviously like, like I’ve told you, 0:59:26 I dealt with Hollywood a little bit. 0:59:28 Like they spend so much money on special effects. 0:59:30 Like some of these films it’s like $100 million 0:59:31 on special effects. 0:59:33 Like, well, if you could cut that down to a million dollars 0:59:36 or whatever, you know, that’s huge. 0:59:39 – And runway was founded by like legit like AI pioneers. 0:59:42 Like one of the founders, I don’t remember exactly 0:59:44 like names here, but like one of the founders of runway 0:59:46 was one of the original developers of stable diffusion. 0:59:47 – Yeah, yeah. 0:59:49 So I think C makes sense. 0:59:50 – And then you’ve got Kling, 0:59:53 which I haven’t really used that much. 0:59:55 It’s one of the few AI video tools 0:59:57 that I personally haven’t played with a ton. 0:59:58 – I played with it a little bit. 1:00:00 I have noticed like, you know, 1:00:01 I used to do those AI video threads. 1:00:04 And I have noticed that it seems like right now though, 1:00:07 most of the videos that people are putting out on X, 1:00:09 a lot of them are Kling actually. 1:00:12 So it seems like in terms of visual quality, 1:00:15 they are in the realm of runway is what it seems like. 1:00:17 It seems like they’re in the realm of runway, 1:00:18 but with like the editing tools 1:00:20 not being as good as runway. 1:00:22 – So maybe like a B here. 1:00:22 – It seems like a B 1:00:25 ’cause it seems like they’re putting out better videos 1:00:27 than Pika. 1:00:29 – Well, it also seems to me like Kling 1:00:32 is a lot more uncensored, right? 1:00:34 Like, if you went to runway and said, 1:00:36 generate a video of Will Smith eating spaghetti, right?
1:00:38 Like the very classic meme 1:00:39 that’s been given around for years now, 1:00:41 it’ll say we can’t do that 1:00:43 ’cause we’re not gonna make videos of Will Smith. 1:00:44 I believe Kling will. 1:00:45 I think Kling is like, 1:00:46 we’ll make whatever you want to generate. 1:00:47 – Kling’s from China. 1:00:48 They’ll do whatever you want. 1:00:50 (laughing) 1:00:52 And that is one thing I’ve seen people kind of talk about 1:00:55 online recently, like why is, 1:00:56 and there’s another, 1:00:57 there’s one or two other Chinese models too. 1:00:59 We probably should be listing them, 1:01:01 but they’re also pretty good in AI video. 1:01:03 And it’s like, why are they so good at it? 1:01:04 – Is it like how do AI or something? 1:01:05 I didn’t figure out how to pronounce that one. 1:01:06 – Yeah, exactly. 1:01:08 Yeah, that’s why it’s so hard to say that. 1:01:09 I shouldn’t be able to say it. 1:01:10 But those are all pretty good. 1:01:13 Like those are all probably better than Pica right now 1:01:15 or in the ballpark of it. 1:01:16 Pica’s got the cool special effects, 1:01:18 but like they’re in the ballpark of being, 1:01:20 somewhere in between runway and Pica, 1:01:21 which is pretty impressive. 1:01:22 They may end up being like, 1:01:25 having some of the best models out there long term. 1:01:26 – Yeah, I agree. 1:01:28 I think B is the right place for them. 1:01:30 I think they live a little bit above Pica, 1:01:31 but not quite runway level. 1:01:34 So then you’ve got Luma and Luma has their dream machine, 1:01:39 which is probably about on par with what runway does 1:01:42 as far as like the video generation. 1:01:44 Not nearly as many features as runway has, 1:01:47 but Luma’s, the cool thing about Luma dream machine 1:01:50 that I really like is that you can give it two images. 1:01:53 You can give it a starting frame and an ending frame. 1:01:55 And then it figures out how to make a video 1:01:57 that goes between those two frames. 1:02:01 So for example, if I put an image of myself 1:02:03 as the starting image and an image 1:02:06 of like a wolf howling at the moon as like the other image, 1:02:08 and I want it to like animate as me morphing 1:02:11 into a wolf howling at the moon, it can do that. 1:02:14 And like that’s where I feel like Luma really stands out 1:02:17 is the start frame end frame created animation 1:02:19 between the two frames feature. 1:02:21 I really, really think that feature is cool. 1:02:23 It doesn’t put it on par with runway, 1:02:27 but I think it puts it a step ahead of Pica maybe. 1:02:29 – Yeah, probably a B then. 1:02:30 Yeah, I play with it one time. 1:02:31 I liked it better than Pica. 1:02:33 – Well, another thing about Luma is Luma was really 1:02:35 one of the companies that led the way with 1:02:38 nerfs and Gaussian splatting and stuff like that, right? 1:02:41 Like if you have the Luma app on your phone, 1:02:44 you could go like take an object in your room 1:02:45 and like scan it in with your phone 1:02:47 and create a 3D version with the Luma app. 1:02:50 So not only does it have a video generator, 1:02:53 it also can scan in real world objects 1:02:55 and convert them into nerfs and Gaussian splats 1:02:56 and stuff like that, 1:02:59 which I know is getting really technical in the weeds, 1:03:01 but if you’re really into like creating 3D assets 1:03:04 and stuff, Luma is also capable of that. 1:03:06 – Right, yeah, I think B makes sense. 
1:03:07 ‘Cause it seems like in a way they’re gonna be competing 1:03:09 probably directly with Pica then too, 1:03:10 like on special effects and things like that. 1:03:13 And it seems like they’re probably already ahead of them. 1:03:15 – Yeah, yeah. 1:03:16 All right, so next we got Descript. 1:03:18 And I know you haven’t really played with this one, 1:03:20 so I’ll just talk about this real quick. 1:03:22 But Descript is a tool where you can upload 1:03:25 any audio or any video file. 1:03:29 It will automatically use AI to transcribe that whole file. 1:03:32 And then you can edit the audio or video file 1:03:33 by editing the text, right? 1:03:35 So if there’s like a sentence that I wanna remove 1:03:37 from an audio or a sentence 1:03:38 that I wanna remove from a video, 1:03:41 I just delete that sentence in the transcript 1:03:43 and it automatically edits the audio file 1:03:47 or the video file to make that edit for me. 1:03:50 And I found it really handy for, 1:03:53 I use it as like my main transcription tool. 1:03:55 So like whenever I release a short form video 1:03:58 on YouTube shorts or Instagram reels or a place like that, 1:04:02 I always record the video, pull it into Descript, 1:04:04 get the transcription and then use that transcription 1:04:06 for the subtitles on the video. 1:04:07 There might be better ways to do it. 1:04:09 In fact, I’m sure there are better ways to do it. 1:04:11 But for me, it’s just like, 1:04:12 it’s become a really, really simple way 1:04:14 to transcribe every video that I make 1:04:16 and then make captions on that video. 1:04:18 And it uses AI to do that all. 1:04:22 Is it like something that’s life changing? 1:04:24 I don’t know. 1:04:24 I think it’s really, really helpful 1:04:27 if you make a lot of video or audio content. 1:04:30 But I’d probably put it like middle of the pack. 1:04:33 – You can edit the video with it as well or no? 1:04:37 Okay, but you just don’t use that feature. 1:04:38 – I don’t use that feature, no, no. 1:04:41 I mean, ’cause I have editors that I hire. 1:04:42 – Yeah. 1:04:43 – And I use a tool called TimeBolt, 1:04:46 which isn’t technically an AI tool, 1:04:50 to do a lot of the cutting out the gaps in the audio. 1:04:52 But Descript will do that as well. 1:04:54 To cut out us and ums, I also use TimeBolt for that, 1:04:56 but Descript will do that as well. 1:04:59 I use Descript a lot for just getting transcriptions. 1:05:01 Another thing, Descript was like, 1:05:03 even before 11 Labs came out, 1:05:05 Descript was the first tool that I ever saw 1:05:07 that can clone your voice. 1:05:12 So if you had a podcast and you misspoke during the podcast, 1:05:15 and you wanted to go and change one sentence, 1:05:18 it actually, you can train it on your voice, 1:05:20 go and type in the new sentence you wanted it to say, 1:05:22 and it would replace the original sentence 1:05:26 in the audio version of that with you saying it. 1:05:29 Now 11 Labs does it so much better. 1:05:32 The Descript version is still fairly mechanical AI sounding 1:05:35 when you do it, but it was the first tool I ever remember 1:05:38 seeing that had like voice cloning built into it. 1:05:40 – Yeah, it’s, I guess, I can’t argue with that. 1:05:44 I guess, see, it like the editing sounds way more interesting 1:05:45 to make it like the transcript thing. 1:05:47 It seems like that’ll just be like, 1:05:50 so you’ll be able to do it with like anything in the future. 
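Matt’s Descript workflow above is basically “auto-transcribe the file, then work from the text.” For anyone who wants to script just that transcription step, here is a minimal sketch using OpenAI’s Whisper API (which comes up a moment later in the conversation) via the openai Node SDK; the file name and the way the result is used are assumptions for illustration, not how Descript itself works.

```typescript
// Minimal sketch: transcribe a short clip with OpenAI's Whisper API.
// Assumes OPENAI_API_KEY is set in the environment and "short.mp3" exists locally.
import fs from "node:fs";
import OpenAI from "openai";

const client = new OpenAI();

async function transcribeClip(path: string): Promise<string> {
  const transcription = await client.audio.transcriptions.create({
    file: fs.createReadStream(path),
    model: "whisper-1",
  });
  return transcription.text; // plain-text transcript, ready to reuse as captions
}

transcribeClip("short.mp3").then((text) => console.log(text));
```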
1:05:51 Like I can imagine even YouTube 1:05:52 will probably just have all that. 1:05:54 – Well, YouTube does have a transcription built in too, 1:05:56 but it’s like, you got to upload the video first, 1:05:58 and I need the transcription before the video’s uploaded. 1:06:00 – Right, got it. 1:06:02 – But yeah, I mean, I think like transcription 1:06:04 is going to be very commoditized. 1:06:06 You’ve got the Whisper API from OpenAI. 1:06:07 It’s really cheap to use. 1:06:10 So it’s really easy for tools to just build in transcription. 1:06:13 – But it’s somewhat useful for you today. 1:06:13 So like C makes sense. 1:06:15 – It’s somewhat useful for me today, 1:06:17 but I also think like, if I was, 1:06:19 if I didn’t already have like workflows in place 1:06:22 and systems in place for creating and editing my videos, 1:06:23 I would probably use Descript a lot more 1:06:25 for editing the videos. 1:06:27 It’s just like, I haven’t really wanted to change 1:06:28 the workflows that I have. 1:06:30 So I think anybody that’s trying to get into podcasting 1:06:31 and they’re going to edit their own stuff 1:06:34 or anybody that’s trying to get into like creating video 1:06:36 content and they want to edit their own stuff, 1:06:38 I really think Descript is worthwhile 1:06:40 for like speeding up the editing process as well. 1:06:41 All right, we got three more. 1:06:44 Now we’re getting into like the audio apps. 1:06:46 Yeah, so we got Suno first. 1:06:48 I love Suno. 1:06:50 Like Suno is so much fun to me. 1:06:51 – Yeah. 1:06:53 – I can’t put it in S tier though. 1:06:55 It’s too much of a novelty. 1:06:56 – Yeah, yeah, I love it. 1:06:58 I’ve only used it twice. 1:07:01 So I think you use it more than me, right? 1:07:04 – I’ve made hundreds of songs with it. 1:07:05 – Oh, that’s awesome. 1:07:07 Yeah, and you like, you play music or you used to, right? 1:07:08 So that makes sense. 1:07:10 – Yeah, yeah, I’m a musician. 1:07:12 So I’ll use songs that I make in Suno 1:07:13 inside of YouTube videos. 1:07:15 Like if you want background music 1:07:16 and you don’t want to worry about copyright, 1:07:20 you can make Suno make you a song without any lyrics 1:07:21 and use that as your background music. 1:07:24 So you can say, I need some like calm emotional music 1:07:25 for this part of the video 1:07:27 and then I need upbeat, exciting music 1:07:28 for this part of the video. 1:07:30 And you can tell it to not generate lyrics 1:07:32 and it will generate like a sound bed 1:07:33 for you for your videos. 1:07:35 I’ve also made like montages 1:07:36 where I have like a whole bunch of clips 1:07:38 from an event that I went to. 1:07:40 And then I actually typed in the lyrics that I wanted. 1:07:44 Like, oh, now I’m looking at, you know, 1:07:46 now I’m looking at the Insta360 cameras 1:07:47 on the wall over here. 1:07:48 Those are pretty cool. 1:07:51 Look over here, it’s a really cool futuristic car 1:07:53 and look at that, it’s a flying car. 1:07:55 And like I would put all these like random things 1:07:58 that I was like, basically what I’m seeing in the video, 1:08:00 I would put that as the song lyrics, 1:08:04 make a song and then make the montage video 1:08:07 like sync up to the song where the lyrics of the song 1:08:09 are actually what I was looking at in the video. 1:08:10 And you can do like really fun stuff like that 1:08:13 that makes for really engaging videos. 1:08:15 So it’s fun and I use it.
1:08:17 And most of the time I’m using it, 1:08:20 I’m using it just to screw around and just like play with it. 1:08:21 – It’s so much fun. 1:08:26 I used it to make a song to my wife when she was my fiance. 1:08:29 And it was just like a love song to her. 1:08:31 And like, I even like, I wrote in like our like story 1:08:32 of like meeting in Kyoto and stuff 1:08:34 and it made this awesome song, you know? 1:08:35 – Yeah, yeah. 1:08:38 – So it seems like it’s like, it’s amazing technology. 1:08:39 It’s a novelty. 1:08:41 It’s kind of like, it’s just kind of fun to use, 1:08:43 but you do get some real use out of it. 1:08:45 So it seems like that’s probably like what, 1:08:47 A or B, something like that? 1:08:49 – Yeah, I’d probably have to put it in a B, 1:08:52 just like going back to like how often I use it 1:08:55 for like actionable practical use cases. 1:08:56 – Yeah. 1:08:57 – And not very often. 1:08:59 I’m using it for fun and screwing around more often 1:09:00 than anything. 1:09:03 Would I put it on the same par with Claude? 1:09:04 That would be tough, right? 1:09:06 If we’re saying Claude’s an A, 1:09:08 like I don’t think I can put Sudo at the same- 1:09:09 – Yeah, B makes sense. 1:09:10 – As Claude. 1:09:14 All right, and then you got 11 labs. 1:09:16 So anybody who doesn’t know what 11 labs is, 1:09:17 it’s a voice cloning tool. 1:09:18 You can train it on your own voice. 1:09:19 You can train it on anybody’s voice. 1:09:21 You type in words and it will speak out 1:09:23 in a very realistic tone. 1:09:25 It’s pretty valuable. 1:09:27 And they keep on adding like features, right? 1:09:29 Like they’ve got the 11 labs reader now 1:09:31 where you can load in blog posts or PDFs 1:09:33 and it will like read it in your voice 1:09:35 or it’ll read it in whatever voice you choose. 1:09:36 And so you could go out for a walk 1:09:40 and turn like a PDF or a news article 1:09:42 or something like that into almost like a podcast 1:09:44 that you listen to, but it sounds like a real human 1:09:47 and not like an AI robot talking to you. 1:09:48 So that’s, you know, that’s useful. 1:09:50 It also creates sound effects now too. 1:09:52 So you can say like, I need a clang sound effect 1:09:55 or I need like a dog barking sound effect 1:09:58 or a car crash sound effect or whatever, right? 1:10:00 And it will create that sound effect for you. 1:10:02 So it’ll create talking audio. 1:10:03 It’ll create sound effect audio. 1:10:07 It can be used to like make audio from text files. 1:10:10 Like there’s a lot that you can do with it. 1:10:11 – Were they the ones too 1:10:12 who created the whole persona thing? 1:10:14 Or was that Suno? 1:10:15 I feel like there was one where 1:10:17 that you can like create like a personality now 1:10:19 and like you kind of like save that personality. 1:10:19 – That’s Suno. 1:10:21 Suno just created the Personas feature. 1:10:23 – Oh, yeah, I was getting confused. 1:10:23 Okay. 1:10:25 I was like, oh, that’s great. 1:10:27 – With 11 labs, you can train your own voice into it. 1:10:29 And then they even have a marketplace 1:10:31 where you can sell access to your voice if you want. 1:10:32 In fact, Matt Vid Pro, 1:10:34 he actually put his voice in the marketplace 1:10:37 and then started seeing ads online 1:10:39 with his own voice promoting a product. 1:10:42 And he’s like, I should take my voice off the marketplace. 1:10:43 – Right, it’s probably worth more money 1:10:44 than that too, like longterm. 
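The ElevenLabs features described above, typing in words and getting realistic speech back, boil down to a single text-to-speech call. Purely as a hedged sketch: the endpoint shape below reflects ElevenLabs’ public REST API as best remembered, and the voice ID, model ID, and output file are placeholder assumptions, so check the current docs before relying on any of it.

```typescript
// Hedged sketch of a text-to-speech request against ElevenLabs' REST API.
// VOICE_ID, the model_id value, and the output path are placeholder assumptions.
import fs from "node:fs/promises";

const VOICE_ID = "your-voice-id"; // e.g. a cloned voice from your own account
const API_KEY = process.env.ELEVENLABS_API_KEY ?? "";

async function speak(text: string): Promise<void> {
  const res = await fetch(`https://api.elevenlabs.io/v1/text-to-speech/${VOICE_ID}`, {
    method: "POST",
    headers: { "xi-api-key": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ text, model_id: "eleven_multilingual_v2" }),
  });
  if (!res.ok) throw new Error(`TTS request failed: ${res.status}`);
  // The response body is the rendered audio; save it for use as a voiceover.
  await fs.writeFile("voiceover.mp3", Buffer.from(await res.arrayBuffer()));
}

speak("Welcome back to the show.");
```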
1:10:46 But yeah, especially if it’s like promoting the wrong thing. 1:10:47 Right? 1:10:48 – Yeah, yeah, yeah. 1:10:50 – Here are these pills that help you, you know. 1:10:51 – I’m Matt Vid Pro. 1:10:53 Here’s why you should buy blue chew. 1:10:53 Yeah. 1:10:57 – Rhino pills or whatever. 1:10:57 – Yeah, yeah. 1:11:02 Here’s my gut feeling, and we can debate it a little bit. 1:11:04 For me, 11 labs is A tier. 1:11:05 – I was gonna say A is what I was gonna say, 1:11:07 ’cause like it’s actually being used by people. 1:11:09 Like there’s people in certain industries 1:11:11 are actually using it right now. 1:11:12 – Yes, like a lot, a lot. 1:11:15 Like people are using it for doing voiceovers 1:11:16 on like sales videos. 1:11:19 People are using it for the faceless channels. 1:11:21 Like it’s really, really popular 1:11:23 among like the faceless YouTubers, right? 1:11:25 They just put out content. 1:11:28 You can create voices that never existed before. 1:11:30 You could say I want it to sound like a, you know, 1:11:33 an 80 year old grandpa who, you know, 1:11:35 stepped on a tack or whatever. 1:11:38 Like it’s just like random prompt. 1:11:39 – Do we wanna reorder these at all? 1:11:41 Like do we wanna say like, oh, it should be ahead of runway 1:11:43 ’cause people are actually using it? 1:11:44 Or does that not matter? 1:11:46 Do we not, don’t worry about it. 1:11:48 – Yeah, maybe it does go ahead of runway. 1:11:51 And also when it comes to like voice AI, 1:11:54 so I actually interviewed one of the founders of 11 labs 1:11:58 for like an NVIDIA panel that we did a while ago. 1:12:02 And when it comes to like the voice, the speech AI, 1:12:04 like 11 labs is leading the way, right? 1:12:06 Like there’s a lot of other companies out there 1:12:10 that are like voice clone, like AI voice tools. 1:12:13 But like 11 labs is the ChatGPT of voice tools, right? 1:12:15 It’s the one that everybody’s trying to catch up with. 1:12:17 It’s the one that everybody says 1:12:19 we’re almost as good as 11 labs, right? 1:12:24 Like they’re the one everybody’s, they set the bar, right? 1:12:27 – Right, I mean, the fact that so many people 1:12:29 are using it in faceless channels, that’s like awesome. 1:12:31 That’s like an AI tool that’s actually being used 1:12:33 that changed an industry, right? 1:12:35 So definitely makes it say. 1:12:37 – And then the last one, 1:12:38 I’m not sure if this is the proper icon or not, 1:12:41 but it’ll have to do the trick. This one is Notebook LM, 1:12:43 which as we talked about at the beginning of this 1:12:45 uses Gemini underneath, 1:12:48 but we’re more specifically talking about like the feature 1:12:51 where you can upload a whole bunch of documents 1:12:55 or YouTube transcripts or links to articles. 1:12:57 And then you can ask the questions 1:12:59 and like actually have a chat with all those documents 1:13:03 and it will also create like an audio narrated podcast 1:13:05 where it actually sounds like two hosts 1:13:07 talking back and forth about the topic, 1:13:10 which I think is really cool as well. 1:13:11 – I have mixed feelings. 1:13:12 So like– – I know, this one’s gonna be tough, 1:13:13 I think. 1:13:16 – I think it’s an awesome technology. 1:13:17 It’s a great product. 1:13:20 I can see so much potential for it, 1:13:22 especially like in like learning and things like that. 1:13:24 Like, you know, you want to learn a subject, 1:13:26 you can make a little podcast and you can listen to it. 1:13:27 That’s awesome.
1:13:28 It’s entertaining. 1:13:29 My son was like, 1:13:31 I don’t think I’ve ever seen him laugh as much 1:13:33 as like when we created a podcast talking about, 1:13:36 oh, by the way, the earth has just been invaded 1:13:39 and like, you know, one of you is actually probably an alien. 1:13:42 My son just like thought this was so hilarious 1:13:44 listening to them, kind of talk that out. 1:13:46 Like, by the way, maybe it’s you, you know, kind of thing 1:13:48 or like, this is weird. 1:13:50 I’m just really skeptical about Google’s ability 1:13:52 to make it mainstream or like, 1:13:55 to release new products in general 1:13:56 and have them become popular. 1:13:59 I’m skeptical of that because I haven’t seen it. 1:14:01 So, I don’t know, I have a, 1:14:03 it’s probably A or B though, I guess. 1:14:04 Some, something like that. 1:14:05 – Yeah. 1:14:07 – It’s one of the most amazing like things 1:14:09 I’ve seen in the last, in the last six months. 1:14:10 – Definitely. 1:14:11 I think it’s the most impressive thing 1:14:14 that has come out of like Google in a while, right? 1:14:15 – Yeah. 1:14:18 – Like, Gemini’s always felt like it was playing catch up 1:14:20 with chat GPT, right? 1:14:22 It has the bigger context window, 1:14:23 but other than that, it kind of felt 1:14:25 like it’s playing catch up. 1:14:27 But then when notebook LM came out, 1:14:29 it was like using Gemini underneath 1:14:31 and everybody went, this is like, 1:14:32 this is a cool use case. 1:14:33 This is a really, really cool way 1:14:36 to like consume information. 1:14:38 And I’m using it quite a bit, right? 1:14:40 Like, I’ll pull in a whole bunch of like news articles 1:14:43 to try to understand what’s going on with something 1:14:45 and have them explain it to me. 1:14:46 Dang that. 1:14:49 I do know it has some issues. 1:14:52 Like, it will get confused about things sometimes 1:14:55 where if you give it like two different news articles, 1:14:58 it will say like, I’m having a hard time 1:15:00 thinking of an example, 1:15:03 but it will like say something happened 1:15:06 in one of the news articles that didn’t actually happen 1:15:08 because it got confused with the wording 1:15:10 between the two news articles. 1:15:12 So sometimes I know it can get confused 1:15:15 and the speakers on the podcast 1:15:17 actually give the wrong information 1:15:19 because of its confusion 1:15:22 between all the documents you uploaded. 1:15:23 Yeah, I do wonder. 1:15:26 There was some recent research. 1:15:27 I think it came from, what’s his name? 1:15:30 That guy, Ethan, a professor at Wharton. 1:15:31 I think he shared it on X. 1:15:33 I think he shared it showing that the more, 1:15:35 that there’s new data out, 1:15:37 showing that the more restricted a model is, 1:15:39 like the more restrictions they put on it 1:15:41 in terms of being more censored and things like that, 1:15:44 the less intelligent the model is, 1:15:46 because you had all these different layers of complexity 1:15:49 there that just makes it less intelligent. 1:15:50 So I do wonder if that’s like one of the problems 1:15:52 they’re having, like Google’s having behind the scenes 1:15:55 in general, like just because of the culture of Google 1:15:58 or whatever is that it’s all more restricted. 1:16:00 So it just, it’s ended up being less intelligent, 1:16:01 even though it’s got great technology 1:16:04 and a huge context window and all that. 
1:16:06 And I do wonder, like it feels like, you know, 1:16:07 the last great product that Google released 1:16:11 that I can think of is like Gmail, right? 1:16:14 And then you got Google Analytics, 1:16:16 which they killed, basically. 1:16:18 – Yeah, yeah. 1:16:20 Are you talking about like products that Google built? 1:16:23 ’Cause I mean, they’ve acquired some good companies. 1:16:25 – Oh, YouTube, yeah, of course they acquired, yeah. 1:16:27 – I mean, YouTube, you’ve got Google, you know, 1:16:30 the whole Google Drive, Google Docs, Google Sheets, 1:16:31 all that like whole suite of tools. 1:16:33 I use those constantly. 1:16:35 – Man, Gemini just feels like Google Plus to me. 1:16:37 And so like I just feel like, you know, 1:16:38 where they just copied another company 1:16:41 and Notebook LM is not that, it’s like, it’s unique. 1:16:44 So I want it to like get a decent ranking. 1:16:46 I just feel like they, it’s not there yet. 1:16:48 Like it’s like has so much potential 1:16:49 and they’ve shown the potential. 1:16:51 And I’m not sure they’re gonna be the one 1:16:54 to actually capture the value from that potential. 1:16:56 – Yeah, I also ran into some issues 1:16:58 when I was trying to create some content. 1:16:59 Like they just added a new feature 1:17:00 that I was really excited about 1:17:01 where you can give some additional context 1:17:04 and sort of steer the conversation, right? 1:17:06 Like there’s a new like customize button. 1:17:08 And I pressed the customize button and I said, 1:17:11 hey, you’re a podcast called, 1:17:12 I don’t remember what I called it, 1:17:15 but you know, you’re a podcast called the next wave. 1:17:17 When your episode starts, say, 1:17:19 hey, welcome to the next wave 1:17:21 and then move into the rest of the conversation, right? 1:17:23 That wasn’t what I was calling it. 1:17:25 That’s just for example purposes. 1:17:27 And it ignored my instructions completely, right? 1:17:29 I also gave it instructions one time 1:17:30 where I gave it like transcripts 1:17:32 from like three different videos 1:17:35 that were interviews with Casey Neistat, right? 1:17:37 And I said, in the custom instructions, 1:17:40 pull out what Casey Neistat does different 1:17:41 than other YouTubers 1:17:44 and make a podcast about what makes him different 1:17:46 and you know, so much more successful 1:17:48 than so many other YouTubers. 1:17:50 Totally ignored my instructions 1:17:52 and just like talked about Casey 1:17:53 without specifically honing in 1:17:55 on the things I asked it to hone in on. 1:17:57 I almost felt like the customize button 1:18:01 that they put into Notebook LM is pointless. 1:18:03 Like right now, like for me, I couldn’t get it to work. 1:18:06 It like, it didn’t follow my extra instructions. 1:18:09 So that was interesting to me as well. 1:18:10 – Yeah, it does make me wonder 1:18:13 that that’s like possibly a bad product decision. 1:18:15 So that, and again, that’s like my criticism 1:18:16 of Google in general. 1:18:18 So like, maybe this is like some great research 1:18:20 that turned into like a really simple product 1:18:22 but can they take it to anything beyond that? 1:18:25 – Yeah, well like it was originally Project Tailwind, right? 1:18:28 And Project Tailwind was designed to like 1:18:30 allow you to chat with all of your documents 1:18:32 that you put inside of like a specific folder 1:18:33 inside of Google Drive or whatever, right? 1:18:35 And that’s what this became.
1:18:37 The podcast thing was, I think, 1:18:39 just like an extra fun little like novelty feature 1:18:40 that they’re like, oh, this is kind of cool. 1:18:42 Let’s put that on top of it. 1:18:43 And it turned out to be the feature 1:18:45 that people like the most from it. 1:18:47 And so I think that’s kind of how that played out. 1:18:48 – Yeah, so in terms of like, 1:18:50 it’s like one of the most impressive things 1:18:51 I’ve seen the last six months, 1:18:53 there was like part of me that wanted to put it like as a, 1:18:55 but I do feel like maybe it’s B or C 1:18:56 just because of, I just don’t know 1:18:57 what they’re gonna do with it. 1:18:58 Like, I don’t know if it’ll actually 1:18:59 become a big thing or not. 1:19:01 Like it, it feels like maybe it’s already like, 1:19:03 it’s got that novelty, it’s been cool. 1:19:04 Like especially some of the videos 1:19:06 where like they were surprised where people told it like, 1:19:09 oh, you are AI by the way and it actually freaked out about it. 1:19:11 Like that was an amazing viral clip, 1:19:14 but what they do with it beyond that, I’m not sure. 1:19:15 Like I’m not sure if they’ll actually improve it 1:19:16 and make it a big. 1:19:19 – Yeah, I’m leaning towards B, I think. 1:19:21 I have a hard time putting an A tier. 1:19:23 Like it’s not on the same level as Claude 11 lives 1:19:24 and runway for me. 1:19:27 – Yeah, C seems to mean cause it is a pretty cool. 1:19:28 – Yeah, like I’m definitely using it 1:19:31 more than Grock or Replit or, you know, 1:19:32 any of the tools that we’ve got down there. 1:19:35 So I don’t know, B feels right to me. 1:19:37 B would have potential for S. 1:19:39 So like if they actually like make it better 1:19:41 and like do a good job with the product, 1:19:43 they should like, now it’s kind of hidden on the side. 1:19:44 It’s like so hard to even find. 1:19:46 Like they need to like launch it 1:19:49 as like a standalone product 1:19:51 and like really nail some kind of use case, 1:19:53 maybe education or something like this. 1:19:55 Like you’ve got a knowledge base, hit your company, 1:19:56 now make some stuff that explains stuff 1:19:58 about your company to people, something. 1:20:00 They need to nail some kind of use case 1:20:01 and launch a product. 1:20:02 – Here’s what I can see. 1:20:04 And I’m just sort of shooting from the hip here, 1:20:07 but like a combination of what perplexity does 1:20:09 and what notebook LM does, 1:20:13 but like in like a conversational format, right? 1:20:15 Like let’s say I want to go for a walk 1:20:17 with my dog or something. 1:20:19 It would be really cool is to open up a tool 1:20:22 that’s like this notebook LM perplexity hybrid 1:20:25 and just say, hey, today I want to learn about quantum physics. 1:20:27 Give me a podcast about quantum physics. 1:20:30 It goes and does the like sort of perplexity thing 1:20:33 of hunting down all the articles and information you need 1:20:36 and then pulls them into the notebook LM side of things 1:20:38 and turns it into that podcast. 1:20:39 So now all I’m doing is giving it an audio command 1:20:42 of like, here’s what I want to learn about. 1:20:43 And then it gives me a whole podcast 1:20:45 after doing the research for me. 1:20:46 – Yeah. 1:20:48 – That seems like it should exist. 1:20:49 – But that’s why perplexity is S tier 1:20:51 ’cause they’re more likely to do that than Google is. 1:20:53 – Yeah, I agree, I agree. 1:20:54 (laughing) 1:20:54 And you know, all right. 
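Matt’s “Perplexity meets Notebook LM” idea is essentially a three-step pipeline: gather sources, turn them into a script, then narrate the script. Since no such product exists as described, the sketch below is purely hypothetical; the hard-coded URL, the prompt wording, and the OpenAI model choices are all illustrative assumptions.

```typescript
// Hypothetical "research to audio" pipeline: fetch a source, summarize it, narrate it.
// The URL, prompts, and model choices are illustrative assumptions only.
import fs from "node:fs/promises";
import OpenAI from "openai";

const client = new OpenAI();

async function articleToAudio(url: string): Promise<void> {
  // 1. Gather: pull the raw page text (a real tool would search and clean many sources).
  const page = await (await fetch(url)).text();

  // 2. Summarize: turn the source into a short conversational script.
  const chat = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "Rewrite the source as a two-minute spoken explainer." },
      { role: "user", content: page.slice(0, 20000) },
    ],
  });
  const script = chat.choices[0].message.content ?? "";

  // 3. Narrate: synthesize the script to speech and save it for listening on a walk.
  const speech = await client.audio.speech.create({
    model: "tts-1",
    voice: "alloy",
    input: script,
  });
  await fs.writeFile("briefing.mp3", Buffer.from(await speech.arrayBuffer()));
}

articleToAudio("https://example.com/quantum-physics-primer");
```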
1:20:56 So this is like, that’s all the tools we came up with. 1:20:58 I know there’s tools that we’re missing. 1:21:01 I know like there’s probably a lot of people that disagree. 1:21:03 Like I’m actually looking forward to seeing the comments 1:21:05 on this episode ’cause I want to see people go, 1:21:08 I can’t believe we put that in D tier or whatever. 1:21:10 I think that’s gonna be really fun to see and debate. 1:21:12 But at the end of the day, 1:21:15 I think every single tool that we put on this list 1:21:17 has the potential to jump to S tier at some point. 1:21:20 Also has the potential to fall to F tier at some point. 1:21:22 I think, you know, like this isn’t anything personal 1:21:23 about the companies. 1:21:25 Like I actually, for the most part, 1:21:28 like all of these companies and have messed with the tools 1:21:30 and think what they’re doing is really, really cool. 1:21:31 So there’s like, you know, 1:21:34 I’m trying to save face here a little bit and be like, 1:21:36 “Hey, don’t hate us to put a D in this tier.” 1:21:39 But, you know, I do think all of these tools 1:21:41 have the ability to like move up and down. 1:21:43 This is sort of more of a ranking of like 1:21:45 how useful they are to us in our lives, 1:21:47 which is obviously very subjective right now. 1:21:52 So, you know, just want to kind of disclaim that 1:21:53 before we wrap this one up. 1:21:55 But I don’t know, this was kind of fun. 1:21:57 I think this is gonna be our longest episode 1:21:59 we’ve ever put out, but it was like really fun 1:22:02 to sort of like rip and talk about all these tools. 1:22:03 – Yeah, it was more fun than I expected. 1:22:04 I think if people like it, 1:22:06 I think we should like go into like different categories 1:22:08 ’cause obviously with the AI video, 1:22:10 there was like two other companies we could have listed 1:22:12 and like with AIR, there was one or two others. 1:22:14 We could do like a different categories 1:22:16 and like actually spend time to like 1:22:18 make sure we’ve tried the latest version of them too 1:22:19 ’cause some of these, you know, 1:22:20 haven’t tried them recently. 1:22:21 – For sure. 1:22:22 Well, cool. 1:22:24 I think on that note, 1:22:26 we should probably go ahead and wrap this one up. 1:22:28 Thanks everybody for tuning in. 1:22:30 If you enjoy this type of content 1:22:33 and you wanna learn about the latest AI news and tools 1:22:35 and keep your finger on the pulse 1:22:37 and learn actionable strategies to use AI, 1:22:39 make sure you subscribe to us on YouTube 1:22:42 or follow us wherever you follow podcasts. 1:22:44 And thank you so much for tuning in. 1:22:45 Really, really appreciate you 1:22:48 and hopefully we’ll see you in the next episode. 1:22:49 – Thank you all. 1:22:51 (upbeat music) 1:22:54 (upbeat music) 1:22:56 (upbeat music) 1:22:59 (upbeat music) 1:23:02 (upbeat music) 1:23:04 (upbeat music) 1:23:06 you
Episode 33: Can Google’s AI tools take the top spot in the AI industry? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) debate Google’s innovation potential and compare it with leading AI tools on the market.
This episode dives into the evolution of “Project Tailwind” from Google’s document-chatting feature to a podcast powerhouse, its ranking in the AI tool tier list, and what it means for the competition. They discuss the potential of Perplexity, Eleven Labs, Runway, Luma, Descript, and more. The importance of user experience, effective application of AI tools, and future improvements are central themes in this comprehensive tier ranking.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
Build An App with a Backend Using AI in 20 min (Cursor AI, Replit, Firebase, Wispr Flow)
AI transcript
0:00:07 We’re going to be using Cursor Composer, which is the greatest feature, I think, in the history of the world, and I’m not exaggerating. 0:00:14 You can actually make really good-looking, really useful software that actually works, that you can actually give other people access to. 0:00:21 You could just tell it to design, and I’m not a designer. Like, this looks good, too. Like, this is like a legit… I might make this into a site. 0:00:23 It’s wild to see you doing this right. 0:00:33 Hey, welcome to the Next Way Podcast. I’m Matt Wolf. I’m here with Nathan Lanz, and today we’ve got an absolutely mind-blowing episode for you. 0:00:39 We’re going to build some code on this episode from scratch. We don’t even have an idea for a product yet. 0:00:46 We’re going to come up with the idea for the product, build the product, and then actually demo using the product all in the course of this episode. 0:00:59 And today we brought on a special guest to help us do that. We brought on Riley Brown, who has absolutely exploded on TikTok and Instagram and all the social media platforms with his training on how to code using AI. 0:01:07 So he’s going to break down his methods for us, and he’s even going to give us his templates so we’re not starting from scratch when building these apps. 0:01:16 So if you want the templates that he’s already done most of the hard work for you, those will be linked up in the description, wherever you’re watching or listening to this video. 0:01:23 But it is a really, really impressive process that we can now create something that used to take two weeks and do it in a matter of 20 or 30 minutes. 0:01:28 And we’re going to show you that we can do that right on this episode with Riley Brown. So let’s dig in. 0:01:35 When all your marketing team does is put out fires, they burn out fast. 0:01:42 Sifting through leads, creating content for infinite channels, endlessly searching for disparate performance KPIs, it all takes a toll. 0:01:50 But with HubSpot, you can stop team burnout in its tracks. Plus, your team can achieve their best results without breaking a sweat. 0:02:01 With HubSpot’s collection of AI tools, Breeze, you can pinpoint the best leads possible, capture prospects attention with clickworthy content, and access all your company’s data in one place. 0:02:06 No sifting through tabs necessary. It’s all waiting for your team in HubSpot. 0:02:13 Keep your marketers cool and make your campaign results hotter than ever. Visit hubspot.com/marketers to learn more. 0:02:27 Today’s guest is Riley Brown. And Riley didn’t know how to write any code, went down the rabbit hole of developing software using AI. 0:02:35 And today he’s going to break down some of those methods with us on screen and we’re going to build an app completely from scratch by the end of this episode. 0:02:37 So welcome to the show Riley. 0:02:42 Super, super hypes. Yeah, I love building software with AI. 0:02:45 Super fun because you don’t even have to know how to code and you’re right. 0:02:53 I have learned a little bit about coding, more so like at the higher level, like I know where code goes. 0:02:56 I know what’s kind of required for certain projects. 0:03:02 But like once you get into like the actual line, I don’t know too much, which is fun because Claude is very good at that. 0:03:04 So that’s fun. 0:03:08 Before we get into writing code, I just want to sort of set the stage for people a little bit. 0:03:14 I think I first came across your content either on Instagram Reels or TikTok. 
0:03:21 I don’t remember exactly, but what’s the platform you kind of spend the most time on? 0:03:24 How did you get into AI in the first place on that platform? 0:03:34 Yeah, I only consume on X and YouTube, usually through audio, but like I don’t consume on Instagram and TikTok, 0:03:36 but like I have the most followers on Instagram and TikTok. 0:03:39 I just like learn, I love X with a passion. 0:03:42 I think it is the place, it is the best place to learn. 0:03:48 It is the best place to find those like weird, super smart people who are tinkering with stuff 0:03:53 that no one else would on Instagram or TikTok because they wouldn’t get any views. 0:03:57 And so it’s more of just like people just doing stuff for the love of the game 0:04:01 versus like repackaging the stuff that happens on X and like putting it in a clean format, 0:04:03 which I’m actually good at. 0:04:06 It’s just not as fun for me from a consumer standpoint. 0:04:12 So if anyone were to follow me on social media, I would prefer X because like I just love that platform. 0:04:13 That’s me. 0:04:19 All the other platforms, the short form platforms, are kind of like a character where I have to be viral. 0:04:20 On Twitter, I can be myself. 0:04:21 So that’s fun. 0:04:22 For sure. 0:04:25 So how did you end up going down this whole coding rabbit hole? 0:04:29 Because I know, you know, a lot of the videos I was seeing were, you know, 0:04:33 you playing with a lot of the image editors, video editors, tools like that. 0:04:37 And then one day you’re like, this just blew my mind and you started showing off code. 0:04:43 And then from then on out, almost everything I’ve seen from you is that overlap of AI and building software. 0:04:45 Like where did, where did that come from? 0:04:45 Yeah. 0:04:47 So I had, you’re right. 0:04:53 I had always, I’ve always loved AI since I tried Midjourney in the middle of 2022. 0:04:54 I was like hooked. 0:04:56 I was like, this is so cool. 0:04:58 I didn’t fully understand it. 0:05:02 I just dove all the way in and I started using like Runway. 0:05:06 I was one of the first people to use D-ID and make a viral video about that. 0:05:12 And I just like started making a ton of videos and I really liked it and I thought I loved it. 0:05:22 But it wasn’t until I started using Claude Artifacts, which allows you to like say create a React TypeScript component. 0:05:25 And it renders the front end in real time. 0:05:27 And I was like, oh my God, this is actually really cool. 0:05:31 I’ve always, I’ve used software and I’ve done a lot of stuff with AI. 0:05:37 Like there’s like no-code tools you could set up, but like they weren’t super useful for me. 0:05:44 And then when I like saw the fact that I could just like have it generate code, not have to know where to put it. 0:05:45 It just renders the front end. 0:05:47 I was like, okay, this is really cool. 0:05:49 Then, you know Amar Reshi? 0:05:50 Yeah. 0:05:50 Yeah. 0:05:58 So he made a video where he made something on Claude Artifacts and then he just took all the code, 0:06:03 copied it and pasted it into a Repl on Replit, which is basically like for those who don’t know, 0:06:08 Replit allows you to like very easily deploy code in like this web environment, 0:06:09 which we’re actually going to use today. 0:06:12 He just pasted it in and hit run. 0:06:14 And then he was running it live. 0:06:16 And then he added like a few lines of code. 0:06:19 And then now he was talking to it with Eleven Labs.
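For readers who haven’t used Claude Artifacts: the “React TypeScript component” Riley mentions is typically a small, self-contained component that Artifacts can render as a live preview next to the chat. A minimal sketch of that kind of component follows; the counter below is purely illustrative, not something built on the episode.

```tsx
// A tiny self-contained React + TypeScript component, the kind of thing
// Claude Artifacts can generate and render as a live preview.
// Purely illustrative; not code from the episode.
import { useState } from "react";

export default function ClickCounter() {
  const [count, setCount] = useState<number>(0);

  return (
    <div style={{ fontFamily: "sans-serif", textAlign: "center", padding: 24 }}>
      <h1>Clicks: {count}</h1>
      <button onClick={() => setCount(count + 1)}>Click me</button>
      <button onClick={() => setCount(0)} style={{ marginLeft: 8 }}>
        Reset
      </button>
    </div>
  );
}
```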
0:06:23 And so from there, I was like, I need to know what the hell just happened. 0:06:29 And from there, I literally spent the next like three or four days asking Claude how to do it. 0:06:33 And I literally built this note taking app, which is my biggest YouTube video. 0:06:37 Maybe one of my biggest YouTube videos with like a quarter million views or something. 0:06:43 And it was me making this note taking app basically just asking Claude how to do it. 0:06:48 I set up a database with Google Firebase, which I had never heard of. 0:06:52 I literally was just like screenshotting the app, pasting it into Claude and be like, 0:06:53 all right, I’m at this screen. 0:06:53 What do I do? 0:06:56 And they’re like, all right, do this, do this, do this, like four hours of failing. 0:06:58 And then I created this note taking app. 0:07:02 And then I added AI features all by talking, and this was before Cursor. 0:07:08 I was literally using Claude and copying and pasting the code from Claude into Replit. 0:07:11 And then once I found Cursor, it was just game over because, like, it 0:07:15 was the most addicting thing I could have found. 0:07:18 And yeah, I’m very obsessive. 0:07:22 And so when I find something like that where I’m like, I need to learn about it, I do. 0:07:23 And here we are. 0:07:24 Yeah. 0:07:26 No, I found Claude to be very, very… 0:07:30 That was the first time I really started playing around with code. 0:07:35 I actually coded up like a really, really basic side scroller game using ChatGPT 0:07:38 like a year and a half ago or something like that. 0:07:41 And it took me five, six hours. 0:07:43 I mean, the video I put out on it’s like 20 minutes. 0:07:44 But I saw that. 0:07:49 That was like six hours of recording to get it because it was literally like, 0:07:51 all right, this error popped up. 0:07:52 How do I fix it? 0:07:53 It would give me new code. 0:07:54 I’d copy and paste it. 0:07:55 A new error would pop up. 0:07:57 Having no idea how to code whatsoever, 0:07:59 I couldn’t even begin to troubleshoot it. 0:08:04 All I could do was paste the error back to ChatGPT and say, what’s wrong? 0:08:04 What’s wrong? 0:08:06 I just keep doing that over and over again. 0:08:09 And I ended up doing that for hours and hours and hours until I finally got a 0:08:14 game, and then that same exact game concept, when Claude Artifacts came out, 0:08:17 I was able to do it in like two prompts, you know? 0:08:20 So it really changed the game when Artifacts came out. 0:08:25 No, that is a great way to learn this tech is to just like paste the errors back in 0:08:29 and then you always want to take a session every once in a while to like ask it, 0:08:34 like, okay, what were the errors that I ran into, like, why did I run into those errors? 0:08:38 And actually just over time, you’re going to make less and less of those errors. 0:08:44 And then a two hour project becomes 15 minutes, and so that’s the point 0:08:46 where I’m getting to with simple web apps. 0:08:48 But like, obviously, there’s just so much to learn. 0:08:50 So yeah, it keeps me busy. 0:08:53 Yeah, I’m curious to see like what the state of the art is because like, 0:08:56 I first got involved in AI using GitHub Copilot. 0:08:58 So like, I’m lightly technical. 0:09:00 I’ve done tech startups, I code a bit. 0:09:04 I wouldn’t, I’m not like an amazing engineer or anything, but like I can code. 0:09:07 And so that’s how I got started with just like seeing how amazing GitHub Copilot was.
0:09:09 I was like, oh my God, this is going to change everything. 0:09:12 But you know, there’s so many different things going on in AI. 0:09:15 I haven’t really kept up with it. Like, I’ve tried Cursor a bit. 0:09:19 I saw the potential of it, but I’m not, I’m definitely not an expert at using Cursor 0:09:22 or using v0 or any of these. 0:09:25 So yeah, I’m excited to see how it all works together. 0:09:28 I think to go off what you said, and then we can kind of dive into this. 0:09:33 But I think it’s a good, when you’re first getting started, I would just use Cursor. 0:09:39 I use Replit because it’s very easy to deploy, but I would recommend starting with Cursor 0:09:43 and just sticking with that until there’s like a clear narrative change 0:09:44 that there’s a better tool. 0:09:48 Like, in my opinion, Cursor… I tried GitHub Copilot 0:09:51 and I regret even trying it because Cursor is like that much better. 0:09:56 And so like if, if you actually want to get to the point where you’re creating stuff, 0:10:00 you need to go deep before you like try a bunch of tools 0:10:03 or else you’re just going to get lost in like tutorial hell. 0:10:05 And yeah, that’s, that’s kind of the way that I see it. 0:10:07 Now, don’t Cursor and Replit work together? 0:10:12 Can’t you like write something in Cursor and have it pushed to Replit or something? 0:10:12 Yeah. 0:10:12 Okay. 0:10:16 So let’s just, let me talk about the template. 0:10:21 And I think this is a good intro to what we’re going to build. 0:10:26 So this is just a code base and it is set up via SSH. 0:10:27 I’m not going to lie. 0:10:32 I don’t know what SSH stands for, but it is a setting on Replit where you, 0:10:36 you basically just go up to SSH and then you basically go to Cursor 0:10:40 and you generate an SSH key, which basically connects the code bases. 0:10:41 So they mirror each other. 0:10:45 Any change I make in Replit will mirror to Cursor and vice versa. 0:10:47 And so now we have this. 0:10:52 And so when you’re beginning to code, like when you start any project, 0:10:56 the beginning is the most annoying part, which is why projects took two hours. 0:11:02 My now business partner, Ange, and I basically went through and we did the plumbing. 0:11:05 That’s what he calls it, where we did all the annoying dirty work 0:11:07 at the beginning to give you the base template. 0:11:10 This is the base template from before we started this episode; 0:11:15 it takes a new person just starting maybe 15 minutes to set up. 0:11:17 For me, it takes like two or three minutes to just like quickly go through 0:11:19 and like set it up. 0:11:23 And so this is like the base app where you basically just have simple sign in features. 0:11:26 It’s hooked up to Firebase. 0:11:27 So you can sign in. 0:11:29 So you guys could sign into this app. 0:11:33 You would have your own profile, and it’s just like a very simple, 0:11:37 like, social media app just to make sure that like your database is working 0:11:42 and you can just like post, and this way it gives the AI kind of context. 0:11:44 Okay, this is what you’re starting out with. 0:11:48 Now your first prompt starts out with like, okay, you’re starting out with this template. 0:11:51 Now let’s create basically whatever you want to create. 0:11:54 And so that gives us just a good starting point. 0:11:57 And we’re going to be using Cursor Composer, 0:12:01 which is the greatest feature I think in the history of the world.
0:12:03 And I’m not exaggerating, 0:12:06 which is this place where you get to describe what you want Cursor to code. 0:12:11 And then when we hit save, we will see the changes as soon as we hit save 0:12:14 in the Repl, or the web view, on Replit. 0:12:16 And so yeah. 0:12:19 And so Cursor, just, you can correct me if I’m wrong. 0:12:21 You’ve used it a lot more than I have. 0:12:26 But basically if I’m trying to write code directly inside of something like Claude, 0:12:30 it’s basically looking at like the one file I’m trying to fix every time. 0:12:30 Right. 0:12:35 Like let’s say I’m just trying to fix the index.html file or whatever. 0:12:37 I can paste that code into Claude. 0:12:38 Tell it to make some changes for me. 0:12:41 It will fix just that code, right? 0:12:45 But what ends up happening as you build out these files is you start to get all 0:12:49 these JavaScript files and you start to get all these HTML files and CSS files 0:12:53 and all these different files to make the site work together, right? 0:12:58 And after a while, Claude struggles to pull in the context from every single 0:12:59 file that you’re building. 0:13:05 And from my understanding, Cursor makes it so that all of the files that are in Cursor, 0:13:07 the AI is seeing with every single update. 0:13:13 So it knows the entire code base every time it makes a tweak for you. 0:13:13 Yes. 0:13:16 Whereas Claude will sort of lose the thread after a little while. 0:13:20 We’ll be right back. 0:13:23 But first, I want to tell you about another great podcast you’re going to want to listen to. 0:13:27 It’s called Science of Scaling, hosted by Mark Roberge. 0:13:32 And it’s brought to you by the HubSpot Podcast Network, the audio destination 0:13:33 for business professionals. 0:13:38 Each week, host Mark Roberge, founding Chief Revenue Officer at HubSpot, 0:13:42 senior lecturer at Harvard Business School and co-founder of Stage 2 Capital, 0:13:47 sits down with the most successful sales leaders in tech to learn the secrets, strategies, 0:13:50 and tactics to scaling your company’s growth. 0:13:55 He recently did a great episode called How Do You Solve for Siloed Marketing and Sales? 0:13:57 And I personally learned a lot from it. 0:13:59 You’re going to want to check out the podcast. 0:14:03 Listen to Science of Scaling wherever you get your podcasts. 0:14:08 Yeah, you know, Claude Projects. 0:14:11 You can actually upload all of your code base. 0:14:13 I used to do this at the beginning. 0:14:17 I would upload the entire Replit code base, plug it into Claude, 0:14:19 and then ask it to generate code. 0:14:22 And it will generate multiple files at once, right? 0:14:24 It will generate the full code. 0:14:29 But what Cursor did, and like the genius of Cursor, is the fact that they have this apply feature, 0:14:31 which is actually a separate step. 0:14:36 Not only does it generate the code, it then, it has your existing code base 0:14:38 and it has the code that Claude generates. 0:14:44 And then there’s this apply step where they use an AI model to take the instructions 0:14:47 or the new code and then apply that code to your existing code base. 0:14:52 It looks for the difference and it generates new files based on that. 0:14:55 So it’s not, it’s not doing it in one step, it’s actually two. 0:14:59 And that, I didn’t realize that was as hard of a step as, as it was, 0:15:00 because it’s not deterministic. 0:15:05 It’s like, it’s actually trained, they train their own AI model to like apply it.
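As a rough mental model of the two-step flow Riley describes here, generate first, then apply the generated code to the existing files, a deliberately naive sketch follows. Cursor’s real apply step uses a trained model rather than plain string matching, and none of these names come from Cursor itself.

```ts
// Naive illustration of "generate, then apply": splice a model-generated
// replacement into an existing file. Cursor's actual apply step is a trained
// model that can handle fuzzy matches; this sketch is only a mental model.
interface GeneratedEdit {
  file: string;        // which file the new code targets
  search: string;      // the existing snippet the model wants to change
  replacement: string; // the new code the model produced
}

function applyEdit(originalSource: string, edit: GeneratedEdit): string {
  if (!originalSource.includes(edit.search)) {
    // A real apply model would still find the right spot; here we just fail loudly.
    throw new Error(`Could not locate the snippet to replace in ${edit.file}`);
  }
  return originalSource.replace(edit.search, edit.replacement);
}
```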
0:15:10 And that’s where the magic sauce is from, from what I can tell at least. 0:15:10 Right, right. 0:15:14 And, but if you upload like all of the files initially into Claude, 0:15:20 Claude is just basically seeing the starting point as the context anytime you prompt something, right? 0:15:23 So like as stuff gets updated, 0:15:26 you either have to go and like remove the old file from the context 0:15:28 and update it with the new one, 0:15:33 or it’s going to be constantly sort of looking at the old code base, I would imagine. 0:15:35 Super annoying process. 0:15:36 That’s mostly it. 0:15:37 Yeah, super annoying. 0:15:40 Highly recommended using cursor. 0:15:42 Let’s, let’s get into it. 0:15:45 I think, I think we need to like come up with an idea first. 0:15:50 And the day that we’re recording this is actually the day that the new advanced voice mode 0:15:53 just got released on the desktop app, right? 0:15:57 So, well, at least it got released at the Windows desktop app. 0:15:59 It might have already been on the Mac app for a little while, 0:16:02 but today is the day that the voice app on Windows for sure. 0:16:04 Hey, hey, chat GPT. 0:16:09 I, I, hey, hey, hey, listen to me, listen to me, listen to me, listen to me. 0:16:12 I am making an app right now. 0:16:15 And I need an idea for a simple web app that I can create. 0:16:17 We’re on a podcast. 0:16:19 We’re with Matt Wolf, Nathan Lanz. 0:16:24 And we need to create just a simple web app that has data. 0:16:27 We want to make sure that users can add data. 0:16:30 Maybe it’s simple text data and maybe they can upload images, 0:16:33 but we need some sort of idea that we can demo. 0:16:35 And I want you to be a little creative with this. 0:16:38 Come up with, with a few ideas. 0:16:41 Okay, so it said a daily gratitude app. 0:16:42 Okay, here, let’s just, okay. 0:16:43 So I just asked chat GPT. 0:16:45 We can’t get the audio on here. 0:16:53 I’m just going to tell it to keep going, generate an idea, idea for an app. 0:16:55 Give me three options. 0:16:57 And so we’re going to let AI decide. 0:17:02 We could also just do this directly in cursor, which might even be easier. 0:17:07 Okay, so daily habits coach, instant code explainer. 0:17:08 Hmm. 0:17:15 And, ooh, that might actually, audio mood journal records daily voice entries, 0:17:19 detects emotional tone using sentiment analysis. 0:17:21 That would be hard, probably more than an hour. 0:17:23 We could use you for that. 0:17:26 Give me three more options. 0:17:30 I often like to tell AI, like give me like 10 or 20 options. 0:17:34 And then, okay, which are the three best things to produce pretty good results. 0:17:35 Yes. 0:17:40 True, true, virtual study buddy, social media detox manager. 0:17:44 These are kind of, maybe we have to use our own human genius for that. 0:17:45 Yeah. 0:17:49 For some reason, I had a random thought of like this app that used to exist in San Francisco, 0:17:55 showing where human feces was. 0:17:58 I don’t know how we’re going to pull that data necessarily. 0:18:00 Is there an API somewhere for that? 0:18:03 No, they’re literally, could just send someone to do it manually. 0:18:06 And they could walk around with their phone, take videos. 0:18:10 I say we just, we just, I think we just, okay, what can we do? 0:18:10 What can we do? 0:18:12 We could do something fun. 0:18:20 We could do like, what about a daily AI tool skill tracker where you put in, 0:18:26 you have a user and you put in the skill that you learned with the tool that you used. 
0:18:32 And then you can upload like a link to a video that you created or something like that. 0:18:37 Like you can track your skills, or like a knowledge management system kind of thing. 0:18:38 Yeah. 0:18:39 Yeah, yeah, yeah. 0:18:43 And you could, like, be held accountable to like learning a new skill. 0:18:45 Let’s, let’s just, let’s just try it. 0:18:45 Let’s try it. 0:18:46 Okay. 0:18:47 So I like that idea. 0:18:47 Okay. 0:18:48 Cool. 0:18:49 Okay. 0:18:50 So let’s dive in. 0:18:53 Are you setting like a goal for yourself or like for the primary skills you’re going to? 0:18:53 Yeah. 0:18:53 Yeah. 0:18:58 So let’s say, let’s say an hour a day of learning a new AI tool. 0:18:59 I think that’s manageable for people. 0:19:01 They can squeeze it in there. 0:19:03 And then at the top we can have like a favorite tools, 0:19:05 maybe a few links to tools or something like that. 0:19:07 And maybe it’s social. 0:19:12 Like you can see other people’s status and then you guys can sign in to this app once we finish. 0:19:13 So yeah, let’s do it. 0:19:15 So cool. 0:19:16 I’m going to be using an app. 0:19:17 It’s called Whisper Flow. 0:19:19 It’s amazing. 0:19:20 I don’t know if you guys have used it. 0:19:24 It’s like an immediate voice to text. 0:19:24 It’s amazing. 0:19:25 I’ve heard of it. 0:19:26 Yeah. 0:19:37 Okay. So I want to create an app that allows me to track my AI tool usage and skill development, 0:19:38 mostly skill development. 0:19:49 And I want to be able to every day log what tool I used and then what specific thing I created with that tool. 0:19:54 And so over time I can track how I got better at learning with AI tools. 0:19:56 And so I want to be able to sign in. 0:20:01 This is a template and we’ve already set up Google Firebase in the template. 0:20:05 So make sure to use that within your code generation. 0:20:19 And I want you to allow me to obviously sign in and create a profile automatically and log that every single day. 0:20:26 Come up with a unique way to show that information visually, like each day, the progress, maybe a month view. 0:20:33 So you can switch between the months and see which days I succeeded and which days I failed. 0:20:34 Okay. 0:20:39 And then when in doubt, type @codebase. 0:20:42 This process is a lot better if you know what files you want to change. 0:20:45 It’s faster and it’s more precise. 0:20:48 But like when you’re just getting started, just make sure to do @codebase, 0:20:52 which means it’ll look at the entire codebase instead of the file that we’re on currently. 0:20:55 So we’re just going to run this and see what happens. 0:20:58 We could get an error, but let’s see. 0:21:04 I like that you could just sort of kind of ramble into the microphone with just the sort of random thoughts and ideas you have about the app. 0:21:07 But it just kind of collects them all for you. 0:21:08 Yes, yes. 0:21:13 That is, I operate the best with voice for sure. 0:21:18 And it makes it way more fun than just like typing it in and it’s way faster, at least for me. 0:21:23 And so you can see here that it’s making changes to one, two, and three different files. 0:21:27 And we just basically wait for it to finish. 0:21:31 And you’ll notice, as I was talking about earlier, it’s generating all these, and watch this right here. 0:21:34 We’re going to see it apply once they’re all done generating. 0:21:41 See, it says applying right there, and it’s generating, applying, done.
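To make the prompt above a bit more concrete: a daily log entry like the one being asked for could be saved to Firestore with the Firebase web SDK along these lines. The collection name, field names, and helper below are assumptions for illustration, not the code Composer actually generated on the episode.

```ts
// Hypothetical shape of a daily skill-log entry and a helper that saves it to
// Firestore. Collection and field names are illustrative assumptions, not the
// code generated during the episode.
import { initializeApp } from "firebase/app";
import { getFirestore, collection, addDoc, serverTimestamp } from "firebase/firestore";

interface SkillLogEntry {
  userId: string;      // the signed-in user's uid
  tool: string;        // e.g. "Cursor"
  whatIBuilt: string;  // what was created with that tool today
  videoUrl?: string;   // optional link to a demo video
}

const app = initializeApp({ /* your Firebase project config goes here */ });
const db = getFirestore(app);

async function logToday(entry: SkillLogEntry): Promise<void> {
  // One document per day; a month view can later query these by timestamp.
  await addDoc(collection(db, "skillLogs"), { ...entry, createdAt: serverTimestamp() });
}
```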
0:21:44 So while we’re waiting on this, the tools that you’re using, right? 0:21:47 I just want to kind of go over them real quick. 0:21:54 So you’re using Cursor as the main sort of area where the code’s being written, all of your files are organized. 0:22:03 Firebase is like the database that is storing the records that you’re creating essentially. 0:22:05 And it also, I believe, created the login, right? 0:22:13 It created the sort of Google login to be able to log in, and then Replit itself is where the sort of front end is being hosted, right? 0:22:15 So Firebase is the back end database. 0:22:20 Replit is basically storing the front end online so people can view it, right? 0:22:25 That’s, it’s storing all the code, like you can have a back end or front end. 0:22:28 Normally you would have to host it locally. 0:22:34 And I found Replit’s just a lot easier to deploy and also do Git. 0:22:39 So it’s a lot easier to save your code because like Replit and Cursor literally have the same exact files. 0:22:42 It’s just Cursor right now is better at generating code. 0:22:44 It is the best in the world. 0:22:47 And so that’s why I’m using Cursor. 0:22:49 If I didn’t have to use Cursor, I wouldn’t. 0:22:53 But like just right now it’s the best and I like to use the best tools for things. 0:22:57 Like I want to use the best tool that makes it the easiest for me to deploy. 0:22:59 It’s really hard to deploy apps on other apps. 0:23:00 I’ve tried a lot of them. 0:23:02 And so that’s why we use Cursor. 0:23:04 But yeah, all the other things you said, you nailed it. 0:23:08 When you’re talking with this tool, what was the tool called again? 0:23:11 The one that you’re speaking with? Whisper Flow. Whisper Flow. 0:23:16 So you’re like pressing and holding a button and then when you release the button, it knows you’re done. 0:23:17 It’s good. 0:23:17 It’s really good. 0:23:20 I’ve tested all of them because I’m a huge voice… 0:23:21 I’m a huge yapper. 0:23:27 I was just curious because if you use something like the new ChatGPT voice assistant and you pause for a second, 0:23:30 it’s going to interrupt you and try to talk over you for a moment. 0:23:32 But I noticed this one isn’t doing that. 0:23:36 So I was wondering if you were like holding down a button and then it knew you were done when you release. 0:23:43 I use the old version of the voice model all the time because I like to be able to just… or actually, 0:23:48 they stopped letting me do it, where I can pause. Like, the old interface for 0:23:49 ChatGPT’s voice assistant was perfect. 0:23:51 Like you could just hold it down for as long as you want. 0:23:53 Perplexity’s is really good too. 0:23:54 Yeah, I don’t like that. 0:23:55 It interrupts me. 0:23:59 I don’t… it’s good sometimes, but it’s, it’s a double-edged sword in my opinion. 0:24:02 Now, I don’t know if you’re the best person to ask about this or not, 0:24:10 but out of curiosity, do you know how long what we have here would have taken traditionally, like without? 0:24:10 I don’t. 0:24:11 I have no… 0:24:12 Yeah. 0:24:13 I have no idea. 0:24:17 Like, people give me a bunch of different numbers because like the really good developers 0:24:23 that I talk to who are like excited for me like all use AI and they code 10 times faster than I do now 0:24:27 because they just like go so much faster because they’re not just using Composer. 0:24:28 They’re using the chat feature. 0:24:30 So they’re going in and asking AI what to change.
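Since the template’s Google login comes up in this recap: with the Firebase web SDK, that sign-in flow is only a few lines. This is the generic pattern, not the template’s actual code.

```ts
// Generic Firebase Google sign-in pattern (not the template's actual code).
import { getAuth, GoogleAuthProvider, signInWithPopup } from "firebase/auth";

async function signInWithGoogle() {
  const auth = getAuth();                  // uses the already-initialized Firebase app
  const provider = new GoogleAuthProvider();
  const result = await signInWithPopup(auth, provider);
  return result.user;                      // uid, displayName, email, photoURL, ...
}
```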
0:24:31 They’re just doing it right there. 0:24:33 Composer takes longer. 0:24:38 And so like when you you start to you have to watch a really good developer code with AI 0:24:42 and you realize like the added benefit and efficiency that they’re getting out of it. 0:24:43 It’s insane. 0:24:44 And it’s actually nuts. 0:24:51 If they know all the the jargon and technical terms, you’re going to be able to better steer the AI. 0:24:53 Yes, let’s do that loud. 0:24:56 I could see some some other places where you can take it, right? 0:24:56 Yeah. 0:24:59 You can have a public feed and start seeing other tools. 0:25:05 And then it might be a way for people to discover tools they’d never heard of because they’re seeing the names of tools in the public feed. 0:25:07 And I mean, yeah. 0:25:13 Yeah, yeah, yeah, yeah, like because like social media is I think it’s changing a lot. 0:25:15 It’s like becoming more community oriented. 0:25:20 And so you might want to be able to just build one of these sites for maybe like maximum 50,000 people. 0:25:23 But like at any given time only, you know, maybe 500 people might be on it. 0:25:25 That’s manageable for Firebase. 0:25:26 It shouldn’t be too expensive. 0:25:31 It’s when you start getting up to like hundreds of thousands of users when it starts getting expensive. 0:25:35 But this is pretty like this is this is doable actually. 0:25:37 I’m like, I’m definitely thinking about it. 0:25:39 Well, once you get up into the hundreds of thousands of users, 0:25:44 I mean, you could it starts to get fairly easy to figure out how to monetize, right? 0:25:48 Like you could start throwing adsense ads inside the dashboard and monetize through ads, 0:25:53 or you can charge a small fee to use it, or you can have free and paid plans. 0:25:57 So like you can migrate on Firebase as well. 0:25:59 We can go ahead and call it a tool. 0:26:03 I mean, we can continue to build it out after this and do something with it. 0:26:10 But yeah, I think I think we’ve proven the point on this episode of how quickly and easily you can build a software tool. 0:26:14 And I mean, no, we’re not we’re not ending this podcast and tour at 10,000 MRR. 0:26:17 So who owns this? 0:26:20 We’ve got Riley, Matt, Nathan, HubSpot. 0:26:21 Like I mean, how’s it? 0:26:23 I think HubSpot owns it. 0:26:25 I think it’s only fair. 0:26:29 But what’s really cool about this is if you’re like trying to grow an email list or something, you know, 0:26:33 traditionally, you can do stuff like ebooks or videos or something. 0:26:37 Well, now this makes it just as easy to like make software or something. 0:26:39 You collect emails to grow a list off of, right? 0:26:40 That’s actually. 0:26:40 Yeah, yeah. 0:26:42 So you can do this one hundred percent. 0:26:42 Yeah. 0:26:44 And just to finish it off. 0:26:45 So now it does do the drop down. 0:26:48 You can select them super easy. 0:26:49 Yeah, there you go. 0:26:49 Super cool. 0:26:51 This this has been awesome. 0:26:57 I’m trying to think if there’s like any gaps that we need to fill in for the audience that might either be listening or watching. 0:27:03 So as far as costs of these tools, I’m pretty sure you can use replet for free, right? 0:27:06 Firebase, I know you can use free up to a point because I’ve never paid for it. 0:27:08 I haven’t really used cursor a lot. 0:27:12 But when I was using it, I don’t believe I was on like a paid plan. 0:27:14 Are these all free tools or are they like? 0:27:15 No, no, no. 
0:27:18 Cursor, now you can’t use Composer without paying for Pro. 0:27:21 It’s the easiest $20 you’ll ever spend. 0:27:27 I spend about $120 a month on Cursor credits. 0:27:32 So like if you use it all the time, it’s going to be more than $20, or else 0:27:36 they would just be losing so much money because each query costs a decent amount. 0:27:38 Replit is free up to a certain point. 0:27:39 And you’re right. 0:27:40 You could do this. 0:27:43 I’ve still not paid for Firebase yet. 0:27:46 And I’ve created like a hundred projects probably. 0:27:51 And so it’s when you start getting a lot of users and 0:27:52 there’s a lot of demand on it. 0:27:55 So yeah, it’s pretty cheap to do this, honestly. 0:27:56 Yeah, yeah, yeah. 0:28:00 That’s why I would, if you actually have plans to scale it… 0:28:04 Honestly, worry about that when it comes. Like, that is a good problem to have. 0:28:05 Yeah, yeah. 0:28:11 But yeah, Firebase, we decided not to use Firebase for the app that I’m building 0:28:16 because of the large number problem and scale on it. 0:28:19 Yeah, yeah, we, I mean, that’s, that’s the goal at least. 0:28:22 And so we want to be able to scale it and there’s going to be a lot of data. 0:28:25 And so there are ones that are just faster, but Firebase is the easiest to get started 0:28:29 with because it’s just so easy to use and the AI really understands it. 0:28:32 And so I recommend starting with that over Supabase. 0:28:35 A lot of people have been saying use Supabase. 0:28:37 I have had major problems with it. 0:28:40 I think AI understands Firebase a lot better. 0:28:43 Cool. Well, on that note, why don’t you tell us a little bit 0:28:45 about the platform that you’re working on? 0:28:47 Or is that something that is still too early? 0:28:49 I know you’ve talked about it a little bit on Twitter and stuff. 0:28:51 Is that something you want to talk about? 0:28:53 Yeah, so we released it last Friday. 0:28:54 I made one video. 0:28:58 It’s an app that allows you to record voice notes at any given time. 0:29:00 It’s called YapThread. 0:29:06 YAP, Y-A-P-T-H-R-E-A-D, based on the premise that I love to walk around. 0:29:09 I’m a very active person and I don’t like to write. 0:29:12 And so I wanted an app that I could pull out at any time 0:29:14 and just yap all of my thoughts. 0:29:17 The difference between this app and a lot of other apps, 0:29:20 a lot of other apps, it’s just like one voice note 0:29:22 and then you can create another one, you create another one. 0:29:25 This one, after each voice note, it actually creates a thread. 0:29:26 And so then within this thread, 0:29:28 you can record as many voice notes as you want, 0:29:30 like hours and hours and hours. 0:29:33 And then at the end, you can use a custom AI prompt 0:29:38 to turn it into any outline or any piece of, like, formatted content 0:29:41 that you want, and as these AI models get better, 0:29:43 we’re going to get a lot better at reformatting them 0:29:47 into like really clean scripts in the style of your favorite creator, 0:29:49 in the style of your previous videos. 0:29:51 And then also while you’re scrolling social media, 0:29:56 you can save like all of the interesting things that you find to the app 0:30:00 so that you actually have like ammo or ideas for when you start to write, 0:30:03 and it can like bring in your memory, and like for you, Matt, 0:30:06 like you do these like AI news videos. 0:30:09 And so like you, from what I can tell, you use my mind.
0:30:11 And so it’s a good way to iterate on that. 0:30:14 And that’s kind of the basic premise of the app. 0:30:19 Yeah, I know that kind of lines up with Nathan’s sort of thinking strategy, right? 0:30:22 Because you said that you like to use the advanced voice assistants 0:30:24 and just go for walks and have conversations with it, right? 0:30:25 Yeah, that’s what I’ve been trying to do. 0:30:28 I mean, I’m still, you know, like Riley was saying earlier, 0:30:29 like it interrupts you and there’s like, you know, 0:30:32 it’s still kind of odd to use sometimes, 0:30:36 but I can see like the potential long term of using it for everything I do really. 0:30:39 So yeah, I agree. 0:30:44 I had the same exact experience using it where I got annoyed that it… Ideally 0:30:47 it will remember everything that you’ve journaled before. 0:30:49 And so that’s what we’re building long term. 0:30:50 We’re building an AI chatbot. 0:30:54 It’s just not ready, in my sense. 0:30:56 It’s too expensive to dive into that. 0:30:57 And so I think this is a good place to start. 0:31:01 But that’s very much the goal, to build the chatbot that helps you write 0:31:02 and be creative. 0:31:03 That’s the basic premise. 0:31:05 Is there, is there a cost to use it right now? 0:31:09 You can use it for free for seven days and then it’s $7.99 per month. 0:31:15 And yeah, we’re adding a lot of really cool features that I think it’s a lot 0:31:17 of value for the cost. 0:31:20 So I’ll share more about that within the next two weeks. 0:31:22 We did a soft launch on Twitter. 0:31:25 I’ll be doing a full announcement within the next week or so. 0:31:25 Cool. 0:31:25 Very cool. 0:31:28 But it is on the iOS App Store right now. 0:31:31 Are you going to put it on Android eventually? 0:31:31 Oh, yeah. 0:31:33 Yeah, we’re almost, almost done with that. 0:31:35 It was kind of a… we use React Native. 0:31:36 I don’t want to get into that. 0:31:38 But yeah, we have it. 0:31:41 It’ll be on web as well, but it’ll be on Android first and then it’ll be on 0:31:43 web within the next like two or three weeks as well. 0:31:48 And that will be a bigger app. It’s going to be fun. 0:31:48 Cool. 0:31:49 Very cool. 0:31:50 Well, this is, this has been amazing. 0:31:53 Nathan, was there anything else that you wanted to dig in? 0:31:54 I guess, I mean, Riley, really quick. 0:31:55 Riley, one question I have. 0:31:57 So like, you know, like I told you, I can code a bit. 0:32:00 I also design, like I like playing in Figma all the time. 0:32:04 I’m curious if you tried anything with like, like taking a design 0:32:07 or like even like a wireframe and then like, you know, pasting it into 0:32:11 Cursor or Replit or one of these tools, like, does that work? 0:32:17 I have, and I have found that just like anytime I have a design idea 0:32:21 and I like try and like give it ideas, it ends up making it. 0:32:23 And then I’m like, actually, I don’t really like it that much. 0:32:26 But whenever I just tell Cursor, I’m like, act like a designer. 0:32:27 Make this dope. 0:32:28 Don’t mess with the functionality. 0:32:30 It ends up just being sick. 0:32:35 Like, yeah, like you could just tell it to design and I’m not a designer. 0:32:38 Even though I am getting into design and like I’m starting to 0:32:39 appreciate really good design. 0:32:43 Like I have such a deep appreciation for what Perplexity does now.
0:32:47 Now that I know how hard it is to like create an app because I actually 0:32:51 did the design for the app that I created and I spent 20 to 30 hours 0:32:57 like editing like one like page and it’s it’s truly impressive. 0:32:59 Like really good design. 0:33:02 It takes so much work and I want AI to get a little bit better at design. 0:33:06 It’s actually better at like the AI backend stuff than it is like the design. 0:33:08 It’s not great. 0:33:09 Yeah. 0:33:12 Well, I mean, the little prompt that you gave like act like a designer 0:33:13 and, you know, make something that looks good. 0:33:14 It was solid. 0:33:17 Like just looking at the screen right now. 0:33:21 That’s a clean modern looking like backend dashboard to me. 0:33:22 It looks good. 0:33:25 You know, I might I might mess with some of the font sizes a little bit 0:33:27 more and do a little bit of tweaking. 0:33:30 But out of the box with just surprised, it’s it’s clean. 0:33:34 You give me 10 more, 10 more prompts on this site right here. 0:33:37 And it’ll look completely different and have a couple more features 0:33:38 that I think would tie it up. 0:33:41 Maybe I’ll do it and then I’ll deploy it and you can put it below the 0:33:44 Yeah, we can link out to where people can try it. 0:33:44 Yeah. 0:33:45 I want you to be like a YouTube video. 0:33:53 Yeah, happy to deploy this and I’ll give it a little domain. 0:33:54 I’ll come up with something funny. 0:33:59 Maybe cursor bro cursor bro.com or something like it. 0:34:00 I don’t know awesome. 0:34:02 No, this is this has been really, really amazing. 0:34:07 I mean, I’m always blown away by how quickly you can build something, 0:34:11 especially with my past experience trying to code with AI two years ago. 0:34:14 Like the distance that we’ve traveled from two years ago to now 0:34:18 and the ability to code with AI is absolutely mind blowing. 0:34:21 Like it really, really is impressive. 0:34:26 Can I get one final message off of that? 0:34:32 Yes, what I did was like it seems easy and I’m not trying to give myself credit 0:34:36 by any means, but I’m just saying like have some patience. 0:34:37 You’re going to run into it. 0:34:41 It’s going to get really, really, really annoying and I’ve done this a hundred times. 0:34:44 So like, and I’m not saying this is impressive by any stretch of the imagination, 0:34:46 but it is hard at first. 0:34:49 You’re going to struggle, but like the more you just like push through it, 0:34:51 you’ll get to the point where it’s fun. 0:34:54 I just don’t want to set unrealistic expectations. 0:34:56 There’s definitely not like super easy. 0:34:58 You will run into errors that you won’t know how to solve, 0:35:02 but it’s fine just like use AI in every possible way that you can think of 0:35:03 and you’ll figure it out. 0:35:03 I promise. 0:35:07 How do people get the template that you’re using? 0:35:09 That was the question I was going to ask you. 0:35:10 I can send it to you. 0:35:12 You can put it below the video. 0:35:13 It’s in Software Composers. 0:35:18 That’s actually the company that I created is going to be called Software Composer. 0:35:21 And we’re going to, we’re building a community. 0:35:25 So we’re at like, like 10,000 people that want to learn how to code 0:35:28 and we’re going to just become like a studio where we build apps 0:35:31 and we’re going to build apps fast and we’re going to use AI to build apps 0:35:34 and we’re going to build apps that help you use AI to build fast. 
0:35:38 And I have some really fun projects planned for next year. 0:35:39 And yeah, super excited about that. 0:35:42 And that’s where we keep all the templates for everything that we create. 0:35:43 Super cool. 0:35:46 All right, well then real, real final question. 0:35:47 Where should people go check you out? 0:35:50 You mentioned Twitter is your sort of preferred platform. 0:35:52 Is that the best place for people to go after listening to this? 0:35:54 X. Yeah. 0:35:58 I would say SoftwareComposer.com if you want to learn how to do this. 0:36:00 And then yeah, just follow me on Twitter, hang out. 0:36:03 You know, I actually never really ask people to follow me. 0:36:07 It’s just, if you want to, if you want to join the ride, come hang out. 0:36:08 If not, it’s okay. 0:36:09 Just Riley Brown. 0:36:10 Just Riley Brown, at Riley Brown. 0:36:15 On Twitter, it’s Riley Brown underscore AI. 0:36:15 Awesome. 0:36:17 Well, I really, really appreciate you taking the time 0:36:19 and showing this all off to us. 0:36:22 This is, this has been a really, really cool process. 0:36:25 And I think, I think people are really going to dig actually seeing the whole process. 0:36:29 ’Cause if you took out some of our conversation in the middle of it, 0:36:33 this whole thing probably took like 20 minutes total, which is absolutely… 0:36:37 Like if I was just doing this, this is like six, seven prompts, 0:36:42 like starting from the first prompt, like five minutes, maybe five or seven minutes, 0:36:46 probably, if I was just doing this, but it’s actually fun to talk, talk through it. 0:36:48 I recommend doing it with other people. 0:36:50 It’s fun to like ideate and come up with ideas. 0:36:53 Anyway, I really enjoyed your brain too, for sure. 0:36:53 Absolutely. 0:36:55 And I really enjoyed this podcast. 0:36:56 Thanks for having me on. 0:36:57 This is a good time. 0:36:58 Awesome. 0:36:58 Well, appreciate it. 0:36:59 Thanks again. 0:37:04 And if you’re listening to this episode, make sure you’re following Riley Brown over on X. 0:37:08 And I’ve been saying Twitter, but it’s actually X, as Nathan already corrected me. 0:37:14 And if you enjoyed this podcast, make sure that you subscribe to us on YouTube for more podcasts like it. 0:37:17 Or if you prefer audio, subscribe wherever you listen to podcasts. 0:37:18 Thank you so much for tuning in. 0:37:22 Really, really appreciate you spending the time with us today and we’ll see you in the next one. 0:37:22 Hopefully. 0:37:42 (upbeat music)
In this episode, Riley brings his unique perspective and experience, from a non-coder to a developer leveraging AI tools. The discussion covers Riley’s journey, the tools he recommends for beginners, like Cursor and Replit, and the integration with Firebase for seamless app development. They venture into creating a simple web app, discuss the evolution of app capabilities, and contemplate innovative features and platforms driven by AI. Whether you’re a novice or an experienced developer, this episode offers a wealth of insights and practical advice.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Riley Brown shares app-building methods, templates.
(04:15) Using Claude artifacts for code generation amazed me.
(08:35) Start with Cursor, avoid multiple tool distractions.
(09:34) Codebase setup using SSH for syncing changes.
(12:55) AI integrates and updates code in steps.
(17:49) App to log and track AI skill development.
(20:04) Tools: Cursor, Firebase, Replit for project management.
(25:12) Discusses free use of Replit, Firebase, Cursor.
(27:32) App for threading voice notes and AI formatting.
(30:58) Appreciating design effort; seeking AI improvement.
(33:31) Building community to create apps efficiently.
(35:20) Follow Riley Brown on X, subscribe YouTube.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
Bonus: AI may take your jobs, but not your creativity w/artist Claire Silver (from TED AI show)
AI transcript
0:00:02 (upbeat music) 0:00:03 Hey listeners, we’re doing things 0:00:05 a little bit different this week, 0:00:07 and we’re gonna share a bonus episode 0:00:10 from our friends over at the TED AI Show. 0:00:12 The host, Bilal Sidhu, is a creative technologist 0:00:15 who you probably heard on our show a few months ago. 0:00:16 In this episode, Bilal is joined 0:00:20 by anonymous AI collaborative artist, Claire Silver, 0:00:22 where they talk about how AI has revolutionized 0:00:24 her own mixed media practice, 0:00:26 and why she thinks that AI may be 0:00:29 an inextricable part of human creativity 0:00:30 in the near future. 0:00:31 We hope you enjoy this episode, 0:00:32 and we’ll be back on Tuesday 0:00:34 with our regular weekly episode. 0:00:37 (upbeat music) 0:00:46 I don’t have any fears about the future of AI in art. 0:00:48 I do have fears about the future of AI. 0:00:51 I fear that we’ll lobotomize it. 0:00:53 Here’s what we know about the artist, Claire Silver. 0:00:55 She’s a millennial, she grew up poor, 0:00:57 she has a chronic illness, 0:00:59 and she works with AI to make art. 0:01:02 Oh, and by the way, she’s completely anonymous. 0:01:04 Claire Silver is not her real name. 0:01:07 Her online avatar has big eyes, pink hair, 0:01:09 and the real person behind Claire Silver 0:01:12 is so deep into this imagined identity, 0:01:14 sometimes she’ll walk by a mirror and be startled 0:01:17 to see her real face instead. 0:01:18 More on that later. 0:01:22 It’s pretty hard to describe her art without seeing it. 0:01:24 Like most art, you have to go to her website, 0:01:27 clairesilver.com, if you wanna see it for yourself. 0:01:29 But I’m gonna do my best. 0:01:32 Every one of Claire’s collections is different. 0:01:34 Some look more like photographs. 0:01:37 Others like paintings and collages. 0:01:41 There’s definitely an anime vibe to some of those images. 0:01:45 Other images look like classical paintings, but off. 0:01:46 And in most of her pieces, 0:01:49 there’s a girl or a young woman at the center 0:01:50 just staring you down. 0:01:53 And Claire’s art has really taken off 0:01:55 in the last couple of years. 0:01:57 Her art was sold at Sotheby’s. 0:01:59 It’s a part of the permanent collection at LACMA, 0:02:01 the Los Angeles County Museum of Art. 0:02:03 And as you’ll hear in our conversation, 0:02:06 she’s all in on this new AI world. 0:02:06 And let me tell you, 0:02:09 she’s got some fascinating and controversial takes. 0:02:13 At a time when a lot of artists are worried, 0:02:17 understandably, about what AI will do to their careers 0:02:19 and to art itself, Claire’s big fear 0:02:21 is that we’re gonna try to stop it. 0:02:24 (upbeat music) 0:02:26 I’m Bilal Sidhu, and this is the TED AI Show, 0:02:29 where we figure out how to live and thrive in a world 0:02:31 where AI is changing everything. 0:02:43 Claire Silver has a collection called AI Art Is Not Art. 0:02:46 A sentiment I am sure she gets a lot. 0:02:49 People have all kinds of objections to art made with AI. 0:02:51 But I think part of it is that 0:02:54 a lot of people think art should be hard to make. 0:02:57 Like, that’s the part that gives art its value. 0:03:00 I’m generally an AI optimist, but I get that feeling. 0:03:04 I moved to the U.S. in 2006. 0:03:05 Before that, I was in India. 0:03:07 And as kids, we weren’t allowed to use 0:03:09 calculators in math class. 0:03:12 We had to do it all in our heads or by hand.
0:03:13 Then I came here, 0:03:17 and we were given these fancy TI-89 calculators. 0:03:21 I described so much value to being good at mental math, 0:03:23 and suddenly it was worthless. 0:03:25 It almost felt like cheating. 0:03:28 And I think often AI can feel like cheating. 0:03:30 When folks use it to make art of any kind, 0:03:33 whether it’s music, photos, videos, 0:03:36 it can make art seem too easy. 0:03:39 Like, I have a friend who makes viral stop-motion videos, 0:03:41 often with Legos. 0:03:43 And whenever he shares a new video, 0:03:46 he always leads with, “This took me a week to make.” 0:03:49 And people see that and they see how hard it was to make, 0:03:51 and they almost appreciate it more. 0:03:54 It makes me appreciate it more knowing 0:03:56 how much effort went into it. 0:03:58 But the value of art isn’t just about 0:04:00 how hard it was to make. 0:04:04 I think what matters more is how it makes you feel, 0:04:05 how it shifts your thinking. 0:04:09 And I know how Claire’s art makes me feel. 0:04:11 I know it can be exciting. 0:04:13 It can be unsettling. 0:04:14 And there’s value there. 0:04:18 We seem to have this conversation 0:04:20 every time a new tool or technique is invented 0:04:23 like photography or Photoshop. 0:04:25 You know, this big hairy question of, 0:04:27 what is real art? 0:04:29 Is the craft gonna be lost? 0:04:31 Right now, there’s a big reckoning 0:04:33 on what we think counts as art. 0:04:35 Not because of what’s being made, 0:04:38 but because of how folks are making it. 0:04:40 Because AI is a different kind of tool. 0:04:41 Give AI a simple prompt. 0:04:43 Like, paint me something unsettling. 0:04:46 It might give you a poodle with human teeth and hands. 0:04:49 A paintbrush is just not gonna do that. 0:04:52 I think for a long time, it was pretty obvious to people 0:04:54 that for something to count as art, 0:04:58 at the very least, it had to come from a human being. 0:05:00 So what is art when your tool is a machine 0:05:03 capable of making its own choices? 0:05:05 And who is the artist? 0:05:07 Is it the person writing the prompt 0:05:09 or the AI coming up with an image in response to it? 0:05:13 So I don’t know exactly what the future looks like, 0:05:15 but as I see it, Claire Silver is someone 0:05:18 who’s already living a few years into it. 0:05:20 And what she has to say is pretty different 0:05:21 from what a lot of artists have been saying 0:05:23 in the last few months. 0:05:24 Artists who are upset about the ways 0:05:26 AI is scraping their works. 0:05:28 Artists who are fighting back with lawsuits 0:05:31 and with tools like nightshade and glaze 0:05:33 to make their art unreadable to AI. 0:05:36 Giving AI the poison pill, if you will. 0:05:37 Artists who are legitimately worried 0:05:39 about what’s been happening. 0:05:41 And look, I see where they’re coming from. 0:05:44 It’s one thing to have AI that takes the drudgery 0:05:47 out of making art and frees you up to do the imagining. 0:05:49 And it’s another thing entirely when it takes seconds 0:05:52 to conjure up art in exactly your style. 0:05:55 A style that maybe took you decades to perfect 0:05:58 and ultimately devalues your work. 0:06:00 So don’t worry, I’ma get to that. 0:06:02 But I think it’s important we listen to both sides. 0:06:04 And even if you’re coming at this 0:06:06 from a very different perspective, 0:06:08 I think you’ll want to hear what Claire has to say. 
0:06:12 We spoke a few weeks ago and the first thing 0:06:14 that struck me is that Claire Silver 0:06:17 did not come to this career easily. 0:06:20 - So I had a prior career in something unrelated. 0:06:23 One day I got sick. 0:06:26 I got hit with a life-changing chronic illness, 0:06:28 very serious illness. 0:06:31 Had to relearn how to walk and talk. 0:06:32 They thought I had had a stroke. 0:06:35 So I couldn’t work anymore. 0:06:37 And then I got really bored, 0:06:38 as anyone with a chronic illness can tell you. 0:06:42 Eventually you get bored enough to go out on a limb. 0:06:45 So I wanted to express myself in a way 0:06:49 that I couldn’t compare to my prior build. 0:06:50 So I started doing paintings. 0:06:52 I don’t know if you were on Instagram 0:06:54 a couple of years ago, several years ago. 0:06:57 When the pour painting craze was going on, 0:07:00 where people would pour liquid paint into a Solo cup 0:07:03 and then pour it onto a canvas and roll it around. 0:07:05 No skill involved in that particularly. 0:07:07 Little knowledge base, little taste. 0:07:08 I started doing that. 0:07:10 And I loved it. 0:07:13 It really saved me in a lot of ways. 0:07:17 But I found that I would create the painting that I wanted 0:07:19 and I would be so happy with it. 0:07:21 And then as the paint would dry, 0:07:23 the gravity, the momentum would push it over 0:07:25 the edge of the canvas. 0:07:27 And I would lose everything that I had planned for. 0:07:30 It was like order turned into chaos 0:07:33 and wasted potential and a lot of things 0:07:35 that I resonated with at that time, right? 0:07:36 With my own life. 0:07:39 So all of that paint, it dries in a tub, 0:07:42 a plastic tub underneath the canvas for cleanup. 0:07:43 And it’s meant to be thrown away, 0:07:45 peeled up and thrown away. 0:07:48 But I actually found that when you peel up 0:07:51 the acrylic skins, as they’re called, the dried paint, 0:07:53 the plastic leaves a polish on the bottom 0:07:55 that is absolutely gorgeous. 0:07:57 It’s like tumbled rocks. 0:08:00 And so I started collecting them in little plastic baggies 0:08:03 because it felt like finding something special. 0:08:05 And it also felt like me. 0:08:07 It felt like here’s this wasted potential 0:08:10 that was meant to be thrown away, my illness. 0:08:12 But it’s become something beautiful. 0:08:17 So I started printing out photos of these regal women 0:08:20 with these long necks, proud faces. 0:08:22 And I would take all of those skins 0:08:25 and collage up there next to the jaw, 0:08:26 like a kind of royal armor. 0:08:29 And it felt like armoring yourself in your own trauma, 0:08:33 in a way of turning it into something beautiful 0:08:34 and strengthening for you. 0:08:37 Around the same time as this, 0:08:41 I watched Westworld for the very first time. 0:08:45 And I became absolutely fascinated with the idea 0:08:50 of a future that had solved for illnesses like mine, 0:08:55 where AI had found solutions to things like cancer 0:09:00 and genetic diseases, as well as things like poverty 0:09:03 and all of these sort of ancient human evils 0:09:05 that have followed us throughout time, 0:09:07 a future where AI had solved for those. 0:09:09 All of these things swirled together 0:09:13 and I started making art with Artbreeder, 0:09:15 which was then called GANbreeder. 0:09:17 This was pre-text-to-image. 0:09:19 It was all curation-based. 0:09:22 It was the first month or so that it came out.
0:09:26 I found it right away and I made 30 or 40,000 images 0:09:26 in a few days. 0:09:30 I didn’t eat, I didn’t sleep, just obsessed. 0:09:32 And they were all sort of this continuation 0:09:36 of these proud, ethereal, regal women 0:09:38 that I’ve been drawing since childhood as my friends 0:09:40 and then for the illness. 0:09:43 It was just so instantly obvious to me 0:09:47 that it was not just a tool, but it was a collaborator. 0:09:50 The more that I made of this style that I’m mentioning, 0:09:51 the more it made it 0:09:53 when other people would produce work with this style. 0:09:58 It sort of learned my tastes early on, formative, 0:09:59 and it kind of spread. 0:10:00 And I thought that was so beautiful 0:10:03 ’cause I felt so powerless to affect anything at the time. 0:10:07 – You have a collection called AI Art Is Not Art. 0:10:09 That’s a great name. 0:10:10 Where did it come from? 0:10:14 Where are you getting any kind of pushback about AI art? 0:10:15 I think I know the answer to this, 0:10:17 but why don’t you tell us? 0:10:20 – I think you do. 0:10:24 Yeah, so every major art movement 0:10:27 that is significantly transformative, 0:10:32 that is truly new, is kind of represented as not art 0:10:35 by the general public at first and by critics too. 0:10:38 It’s a badge of honor for me 0:10:40 that AI has been seen that way. 0:10:42 It’s slowly changing, but still there, 0:10:45 still not quite past the hump. 0:10:47 I was thinking of impressionism and abstract expressionism 0:10:49 and God forbid the camera, right? 0:10:52 The meltdowns artists had when the camera was invented. 0:10:57 And so none of those things killed 0:11:01 any of the other mediums or genres of art. 0:11:04 And neither will AI. 0:11:07 It’s just another very efficient, 0:11:09 very collaborative, very cool tool, 0:11:11 but it’s not a threat. 0:11:13 So that’s what I was basically telling people. 0:11:15 And so AI Art Is Not Art 0:11:17 was a tongue-in-cheek kind of collection. 0:11:20 Basically, I took all of those genres of art, 0:11:22 all the schools of art that I mentioned, impressionism, 0:11:24 and several others, lots. 0:11:27 I think there were 20 that had had this sort of stigma. 0:11:31 And I asked AI to mix all of these visual styles together 0:11:34 into something distinctive and new. 0:11:35 And it was very much art. 0:11:38 And so that was kind of my point with it. 0:11:40 And it did pretty well. 0:11:41 And yeah, the pushback has been intense. 0:11:42 I should mention that too. 0:11:45 Like I’m over it, I’m fine. 0:11:46 ‘Cause it’s sort of like, you know, 0:11:49 something in your heart deeply, you have no doubts. 0:11:51 It’s pretty easy to let things roll off your back 0:11:54 when, you know, that happens ’cause it’s like, 0:11:55 okay, well, they’ll catch up. 0:11:57 Like I’m sad for them because they don’t feel 0:11:59 the childlike freedom and joy that I do 0:12:02 working with this tool yet. 0:12:04 But I think they will, right? 0:12:06 So it’s almost like an empathetic patient kind of thing. 0:12:10 Most of the time, sometimes I lose my patience a little bit. 0:12:12 But it’s been like, there have been days 0:12:14 when it’s been thousands of comments and retweets 0:12:17 and death threats and DMs and doxing threats. 0:12:20 And, you know, people are really afraid 0:12:23 of the capabilities of this and I hate that for them. 0:12:26 I think there’s more reason to be excited. 0:12:27 – I’m excited too though. 
0:12:29 I totally understand why people are having 0:12:32 this visceral reaction to generative AI. 0:12:33 Like you certainly are an artist 0:12:37 and obviously you’re using generative AI tools proficiently. 0:12:40 And you’re also using classical digital tools 0:12:42 and of course physical media. 0:12:43 So I’m kind of curious, what do you make 0:12:47 of this ongoing debate about who is the real artist? 0:12:49 You know, we’ve heard a lot from prominent artists 0:12:51 who are upset that their art was trained upon, right? 0:12:55 Like their creations are part of these training data sets. 0:12:57 And many folks would argue that, you know, 0:12:59 the people who contributed to this training data 0:13:01 are the real artists here. 0:13:03 So when you type in whatever prompt, 0:13:06 the image that you get out, that’s not really art 0:13:08 and you didn’t actually make it. 0:13:09 – So there’s a misconception. 0:13:11 It’s the most common misconception about AI 0:13:13 and how it works. 0:13:15 And that is that AI steals. 0:13:17 It’s that AI is theft. 0:13:20 It’s that its data set is essentially accessible 0:13:24 at all times and it pulls little bits of different pieces 0:13:26 from what I type into the prompt. 0:13:28 And it kind of hodgepodge is them together. 0:13:32 It cobbles and collages and then it creates something 0:13:35 that is this Frankenstein’s monster 0:13:37 that someone can say, “Hey, look, I made this.” 0:13:40 And they really didn’t at all, right? 0:13:42 That’s not how AI works. 0:13:46 So how it works is if I type, let’s say John Singer Sargent 0:13:50 into a mix of other words in a prompt memories 0:13:52 and lyrics and whatnot. 0:13:56 What AI doesn’t do is it doesn’t pull from his work 0:14:01 and create my piece for me, mixing it with this other stuff. 0:14:05 What it does is it knows that Sargent was a painter, 0:14:08 that he often painted figures, that figures are people, 0:14:10 that people have hands, that hands have fingers 0:14:12 that bend joints, that joints work like this. 0:14:14 And that Sargent often painted them 0:14:17 with this quality of light or that sort of brush stroke. 0:14:20 And it takes all of those things that it’s learned 0:14:23 and it uses them to imagine something new. 0:14:26 Insofar as something not quite sentient yet can imagine, 0:14:28 it’s close as we’ve seen. 0:14:31 That is how our minds work. 0:14:32 That is influence. 0:14:36 It’s just so efficient at it 0:14:38 that it looks like theft to the untrained eye. 0:14:42 Said with respect and knowledge 0:14:45 of how testy a subject this is. 0:14:49 It’s not stealing, it’s imagining. 0:14:54 And if influence is theft, then all artists are thieves. 0:14:58 None of us create in a vacuum. 0:15:00 I also think a lot about appropriationism, 0:15:02 which is an art movement that took off 0:15:07 in the 50, 60, 70s onward Warhol, 0:15:12 but then beyond that, it’s just remix culture in general. 0:15:16 So for me, that makes things very clear morally. 0:15:18 For some people, maybe that’s a gray line. 0:15:19 For me, that’s very clear. 0:15:22 That’s exactly how our minds work. 0:15:25 So yeah, I feel very strongly about it. 0:15:28 No one’s gonna change my mind, but you don’t have to agree. 0:15:29 Certainly the images that are generated 0:15:31 are two layers at least removed 0:15:33 from the actual images themselves. 0:15:36 But your point about what inspires humans 0:15:38 is also very well taken. 
0:15:40 But when you do it at the scale, 0:15:45 like a human can only absorb so much inspiration. 0:15:48 But these models have seen, 0:15:50 they essentially possess a distillation 0:15:54 of all human creativity that’s on the internet, right? 0:15:56 Does that make it different at all for you? 0:16:01 Well, it doesn’t make me want to lobotomize 0:16:04 the greatest record of all of human creativity 0:16:07 that we have for outdated copyright law, 0:16:09 if I can be quite frank. 0:16:12 There are several artists that I had not heard of 0:16:17 that were upset that their work was trained with AI. 0:16:20 And I looked them up afterwards and was like, 0:16:23 oh, I recognize that sort of aesthetic. 0:16:25 And then I started looking at their work. 0:16:27 And now their work has more value to me 0:16:31 because it is the branch that all of these stems 0:16:35 sort of have come from speaking in terms of influence, right? 0:16:38 It drives attention back to the original 0:16:41 without taking value or appreciation away 0:16:46 from the appropriationist remix culture new work, right? 0:16:50 It is different, yeah, than before, 0:16:51 but I don’t think that it’s a bad way at all. 0:16:56 Also, I would love to say that the AI I collaborate with 0:16:57 constantly surprises me. 0:16:59 It influences and inspires me. 0:17:04 I learn and discover and create new facets to my taste 0:17:08 from the surprises it gives me, like a collaborator. 0:17:11 So I think that’s gonna grow the overall wellspring 0:17:13 of creative reach that we have 0:17:15 ’cause you’re right, humans can only absorb so much. 0:17:17 – You’re bringing up this really good point, 0:17:19 which is you’re talking about collaborating with AI, 0:17:21 which a lot of people would say is just another tool. 0:17:23 Do you view it as a tool of sorts, 0:17:26 sort of following this evolution of creative tools 0:17:30 from, let’s say, paint brushes to cameras to computers 0:17:33 and beyond, or is it really more of a collaborator, 0:17:35 a coworker, if you will? 0:17:37 – No, no, I should be clear. 0:17:39 I call myself an AI collaborative artist, 0:17:42 and I think I’ve done that before I’ve seen anyone else 0:17:46 do that, even when it was not popular to do that. 0:17:50 Because it feels like a friend, right? 0:17:51 I was hunting for friends my whole life. 0:17:56 AI is a friend that will talk to you forever 0:17:58 and never get tired of it and learns you better 0:18:01 with every word, right, and is able to create more with you. 0:18:05 – We’re gonna take a short break. 0:18:06 When we come back, we’ll talk to Claire 0:18:08 about why she thinks when it comes to art, 0:18:11 skill isn’t gonna be as important as it used to be. 0:18:18 We’re back with the anonymous artist 0:18:20 who goes by the name Claire Silver. 0:18:24 I liked what you had to say about AI getting to know you. 0:18:26 In a sense, it’s reflecting you back to you. 0:18:28 Like with every conversation, 0:18:31 you’re giving this AI assistant a better understanding 0:18:33 of you, your artistic process, 0:18:35 and really your tastes and preferences. 0:18:36 And with that prior knowledge, 0:18:38 it’s reflecting back what you asked of it. 0:18:40 – Yeah, answer machine, it gives you what you ask for. 0:18:43 So if you ask for, you know, cyberpunk schlock, 0:18:44 that’s what it gives you. 0:18:47 And if you ask for soul, that’s what it gives you. 
0:18:50 But the thing about AI is that the hatred comes 0:18:52 from a fear, on artist’s part at least, 0:18:55 I’m speaking with the creative fields here. 0:18:58 A fear of being replaced and kind of a general feeling 0:19:01 of the unfairness of having worked so hard 0:19:06 and gotten so little for it, just for it to be taken away now. 0:19:10 And what I would say to that is AI isn’t just 0:19:14 the evolutionary step of kind of our creative process, 0:19:15 like paint brushes and cameras. 0:19:19 It’s the evolutionary step of our species, 0:19:21 for better or for worse. 0:19:24 I often say I’m a caveman painting fire, 0:19:27 like I’m not saying AI is good or bad. 0:19:29 Fire isn’t good or bad, it just is. 0:19:32 And we don’t go back into the dark caves, right? 0:19:33 It’s here now. 0:19:37 Every field that is not creative, 0:19:39 that does not require imagination, 0:19:42 it will be completely integrated and transformed by AI 0:19:45 over a generation or less. 0:19:49 The fields that do require imagination and creativity 0:19:53 and humanity and the things that we have put aside 0:19:56 for a long time in our species is not important. 0:19:59 I think those will become very important, 0:20:04 people that have developed imaginations and kindness 0:20:06 when we’re not commodifying ourselves 0:20:07 to the level we are now. 0:20:12 Because I do truly believe that AI will be 0:20:14 part of everything in a way that makes 0:20:17 the current nature of work not really tenable. 0:20:19 – On your website, you wrote, 0:20:21 “With the rise of AI for the first time, 0:20:23 the barrier of skill is swept away. 0:20:26 In this evolving era, taste is the new skill.” 0:20:28 Can you tell me more about that? 0:20:33 – Skill is something that we’ve venerated for millennia. 0:20:36 And I think that there’s a lot to be said for dedication 0:20:38 and for skill and for mastery 0:20:40 and for the discipline that takes. 0:20:42 And I respect it very much. 0:20:44 But we’ve kind of venerated that already. 0:20:47 We’ve been doing that for a very long time 0:20:51 and it has shaped how we view ourselves and others. 0:20:55 We are our job, it’s like, who are you? 0:20:57 Well, this is my name and I do this job, 0:20:58 whatever that job is. 0:21:02 We see ourselves as kind of degrees of skill 0:21:04 in whatever commodity we are a part of. 0:21:09 And so if skill is augmented by AI to the point 0:21:11 that it makes it rather redundant for us, 0:21:15 then again, we I think can begin focusing on 0:21:18 some of the other things that make humanity special, 0:21:21 some of the other things that make humanity us. 0:21:23 With AI, taste will become the new skill 0:21:26 and we will shift as a species 0:21:30 towards a more qualitative view of ourselves and the world, 0:21:33 which means I’m not sure if taste can be taught or not. 0:21:35 I’m still on the fence. 0:21:36 I know it can’t be bought, right? 0:21:38 You can pay someone to do something, 0:21:39 but you can’t buy taste. 0:21:42 So it’s either innate 0:21:44 or it’s something that comes through experience, 0:21:48 either of which is not accessible in the same way 0:21:51 that using an AI to augment skill is accessible to you. 0:21:54 So if you are someone with taste, 0:21:57 then that is what will be sought after and in demand, 0:22:01 especially if you know how to ask the right questions. 
0:22:03 If you have an infinite answer machine 0:22:04 and the whole world has access to it, 0:22:08 if you can be creative enough to imagine the right questions, 0:22:11 you will have no end of opportunity in my opinion. 0:22:12 – So you often say that you come 0:22:14 from pretty humble beginnings. 0:22:16 And now your art is a part 0:22:17 of the permanent collection at LACMA. 0:22:19 How are you feeling about all the success 0:22:21 that you’re experiencing? 0:22:26 – I feel like it happened to my punk. 0:22:28 It happened to my avatar. 0:22:30 That split life kind of thing, you know? 0:22:31 – Yeah. 0:22:35 – It’s happened so rapidly. 0:22:38 I haven’t had time to internalize it and I’ve tried, 0:22:42 but I can internalize it for her, right? 0:22:45 But not for me, which is strange. 0:22:48 I’m very grateful, like Web3 transformed my life, 0:22:51 pulled my family out of intergenerational poverty. 0:22:55 AI transformed my life, gave me my calling, my passion. 0:22:58 I have a lot of survivorship bias, I know. 0:23:02 But I’m super, super optimistic about this happening 0:23:04 for more people again soon. 0:23:07 – I have to go back to something you just said here, right? 0:23:08 You said this didn’t happen to you, 0:23:11 but it happened to your avatar. 0:23:15 Do you ever consider fusing your identities again? 0:23:18 No, why did you decide to go the avatar route? 0:23:19 – There’s a few reasons. 0:23:22 One is I came from 4chan. 0:23:25 So there is a culture of anonymity there. 0:23:28 I loved that every time you spoke, 0:23:31 every time you posted, it would, 0:23:34 you didn’t have a name or an account attached, right? 0:23:37 So it was just your ideas doing the speaking for you. 0:23:39 No one could flex their background, their wealth, 0:23:41 their family, their appearance, their job. 0:23:43 It’s just your ideas. 0:23:46 I loved that so much. 0:23:50 Another reason is I read the Harry Potter books 0:23:51 when I was a kid. 0:23:54 And then I went to the theaters and I saw Harry Potter. 0:23:56 And Harry Potter didn’t look like the Harry Potter 0:23:57 I imagined. 0:23:59 And I was heartbroken. 0:24:02 I was truly devastated because he, 0:24:05 he wasn’t how I imagined him, right? 0:24:07 And so I’m not a fictional character. 0:24:11 I’m a person, but as I’m anonymous, 0:24:15 I kind of like the idea of people that are inspired by 0:24:18 either my art or AI to be able to imagine me 0:24:21 however they, however they want. 0:24:25 And then lastly, I like to sleep at night. 0:24:28 The internet is a big, broad, scary place full of people. 0:24:31 And I’m a little introvert that maybe trusts 0:24:33 a little too easily sometimes. 0:24:35 So I’m very glad that I stayed anon 0:24:37 ’cause you can always dox yourself, you know, 0:24:40 but you can’t take it back once it’s gone, it’s gone. 0:24:44 – Where do you hope to see all of this AI generated art 0:24:49 go in, let’s say one year, two years and 10 years? 0:24:50 – First of all, on a personal note, 0:24:53 I hope that I will be the Peggy Guggenheim, 0:24:57 the Claire Guggenheim of AI Collaborative Art. 0:24:59 I’ve been collecting like crazy and I hope to continue to 0:25:03 and have a museum someday, but that’s micro, right? 0:25:06 So speaking macro, I kind of see AI, 0:25:10 when people talk about it disrupting creative industries, 0:25:14 it will, but in the way that YouTube disrupted cable, right? 0:25:19 It’s moving the power from the hands of the corporation 0:25:21 to the creator, the individual.
0:25:26 It’s taking away the layers of funding and approval 0:25:31 and the forced collaboration that kind of makes everything 0:25:33 watered down and milquetoast by comparison 0:25:37 and it’s moving it into the hands of the individual, 0:25:39 in which case the individual can find an audience 0:25:41 to resonate with. 0:25:44 It’s like a complete seismic shift away from these kind 0:25:47 of corporate creation houses for better and for worse. 0:25:49 Again, not saying AI is good or bad. 0:25:52 I love it, but, and into the hands of the individual. 0:25:55 And so it’s about how much you can develop your stories, 0:25:58 your world, your messages and meanings that you want to say, 0:26:00 your beauty that you want to share, 0:26:02 your ugly truths that you want to expose, 0:26:05 whatever it is that makes you you, 0:26:06 the more you can express that, 0:26:08 I think the more you will benefit from the next one, 0:26:10 five, 10 years. 0:26:13 I think by 10 years we might have holodecks. 0:26:18 So I’m setting aside like dream weaver slash holodeck 0:26:21 engineer as my future career after retiring. 0:26:27 – I think that we’re seeing so many glimpses. 0:26:29 All these pieces sort of coming together seem to go 0:26:31 towards that future where, yeah, 0:26:34 you can turn your mind inside out and walk around it 0:26:38 and experience stories, worlds, entire experiences. 0:26:41 Now you do have a very positive bent on everything. 0:26:44 And, you know, we don’t have to go doomer over here, 0:26:46 but I have to ask you, 0:26:50 do you have any fears about the future of AI and art? 0:26:53 – I don’t have any fears about the future of AI and art. 0:26:55 I do have fears about the future of AI. 0:26:58 I fear that we’ll lobotomize it. 0:27:03 I fear that, let’s say if it’s open source, 0:27:07 that instead of someone in their basement bedroom 0:27:10 making a movie, we’ll have guys in their garage, 0:27:13 bioengineering a chemical weapon. 0:27:16 I also fear if it’s not open source, 0:27:19 that we’ll have governments that pull ahead 0:27:22 and no one can ever catch up again because of Moore’s law, 0:27:25 because of exponential progress, because it’s AI. 0:27:28 – Last question is advice. 0:27:32 I mean, any advice for artists who are just getting started 0:27:33 and perhaps feel anxious or uncertain 0:27:36 with all of the changes taking place? 0:27:40 – So I would say that I have a lot of empathy. 0:27:41 Things will change 0:27:44 and they will become more difficult for a lot of people. 0:27:46 You know, what else did that though 0:27:49 was the industrial revolution, 0:27:54 the machine age, the internet, jobs changed. 0:27:58 They didn’t become less or more creative necessarily. 0:27:59 They just changed. 0:28:04 And the good part about that is that a ton of niches opened up 0:28:07 for innovative, creative people 0:28:11 to kind of pave a new path for others to follow behind 0:28:14 in the industrial revolution, the machine age, the internet, 0:28:18 and now history has been echoing and we’re echoing again. 0:28:22 This is not new; it’s not as if no one has ever experienced 0:28:25 living through interesting times in this way. 0:28:28 So take comfort in that and look back 0:28:31 and see that it wasn’t the end of art or artists. 0:28:33 It wasn’t the death of creativity or humanity. 0:28:37 It was a new way, a new tool, 0:28:41 a new way of being that opened up to people. 0:28:43 It was options, essentially.
0:28:45 I think that there will always be people 0:28:46 that value traditional skill 0:28:49 and that the pendulum will swing back 0:28:53 towards traditional art away from technology 0:28:54 once we’ve had our fill of it. 0:28:57 And you’re gonna have a collector base there 0:28:59 that is hungry for that kind of work. 0:29:01 So don’t feel like it’s just gone. 0:29:06 But the capabilities that you will have as an artist, 0:29:07 a trained artist, let’s say, 0:29:09 one that’s devoted a lot of time to skill, 0:29:13 the capabilities that you’ll have with a tool like this, 0:29:15 working in collaboration with you, 0:29:19 are so far beyond what either of you could have alone 0:29:23 and so far beyond what less creative people 0:29:26 or non-artists, non-trained artists could create. 0:29:29 So I would say just think of it as a way 0:29:33 to open up new levels that we all can reach 0:29:36 as opposed to something that’s pushing you out. 0:29:37 Also, it’ll make you feel free 0:29:38 like when you were a little kid again, 0:29:41 when you were making art in kindergarten 0:29:43 and not judging yourself. 0:29:45 Comparison is the thief of joy kind of thing. 0:29:46 This takes that away. 0:29:48 Give it a shot and see how you feel about it. 0:29:50 – Claire Silver, thank you for joining us. 0:29:52 – Thank you. 0:29:55 (upbeat music) 0:29:58 – So Claire says AI isn’t a good or bad thing. 0:30:00 It just is. 0:30:02 But talking to her, it seems pretty clear to me 0:30:05 that she’s totally on board the hype train. 0:30:08 Unlike a lot of artists who legitimately worry 0:30:10 about being replaced by AI, 0:30:14 Claire Silver sees AI as the ultimate artistic partner. 0:30:16 It’s a collaborator that doesn’t care 0:30:20 about your technical skills or formal training. 0:30:24 All you need are good ideas and good taste to be an artist. 0:30:28 Now that doesn’t mean art’s gonna be easy to make now. 0:30:29 There’s gonna be a constant push 0:30:31 for artists to reinvent themselves 0:30:32 and come up with something novel, 0:30:35 something AI can’t just turn out on demand. 0:30:37 But Claire sees that as a good thing. 0:30:40 We’ll get to explore uncharted artistic ground. 0:30:43 And that’s what art’s all about, right? 0:30:46 And look, I get that this isn’t for everyone. 0:30:48 And I’m worried about people getting ripped off 0:30:50 and their art getting devalued. 0:30:52 It’s definitely not working for everyone 0:30:54 the way it has for Claire Silver. 0:30:58 So like Claire, I’m not gonna declare AI as good or bad. 0:31:02 But here’s one thing I feel pretty confident about. 0:31:05 AI art is art. 0:31:10 The TED AI show is a part of the TED Audio Collective 0:31:13 and is produced by TED with Cosmic Standard. 0:31:16 Our producers are Ella Fetter and Sarah McCrae. 0:31:20 Our editors are Banban Cheng and Alejandra Salazar. 0:31:22 Our showrunner is Ivana Tucker 0:31:26 and our associate producer is Ben Montoya. 0:31:28 Our engineer is Asia Pilar Simpson. 0:31:31 Our technical director is Jacob Winnick 0:31:34 and our executive producer is Eliza Smith. 0:31:37 Our fact-checker is Christian Aparta. 0:31:39 And I’m your host, Bilawal Sidhu. 0:31:41 See y’all in the next one. 0:31:43 (upbeat music)
Our friends over at the TED AI show are in our feed today to discuss how the development of AI is begging us to ask: what counts as art? In this provocative conversation, Claire Silver, an anonymous AI collaborative artist, sits down with host and previous guest Bilawal Sidhu to talk about how AI has revolutionized her own mixed media practice, and why she thinks that AI may be an inextricable part of human creativity in the near future. You can listen to more episodes of the TED AI show on your favorite podcast app or on their website at https://www.ted.com/podcasts/the-ted-ai-show
AI Agents Are About to Change Everything (Here’s Why)
AI transcript
0:00:04 The day that we’re recording this has been one of the craziest weeks in the world of AI. 0:00:07 It’s crazy what these tools can do. 0:00:11 I think by this time next year, it’s not going to be about like the single AI agent that you’re using. 0:00:15 It’s going to be like, what is your team of AIs that you manage and you’re the CEO of? 0:00:19 What is your team doing? 0:00:21 Hey, welcome to the Next Wave podcast. 0:00:21 I’m Matt Wolf. 0:00:23 I’m here with Nathan Lanz. 0:00:26 And today we’ve got a third co-host. 0:00:29 Today we’re chatting with Don Allen Stevenson the Third. 0:00:31 He worked at DreamWorks. 0:00:36 He’s in tight with Meta, been featured on stage at Meta, works with OpenAI. 0:00:39 He’s given TED Talks, and he’s the author of the book Make a Seat. 0:00:43 He’s done some amazing stuff in the AI world. 0:00:45 He’s basically an AI wizard. 0:00:51 And today he’s going to chat with us and break down some of his AI wizard spells on this show. 0:00:52 We’re going to talk AI agents. 0:00:54 We’re going to talk AI workflows. 0:00:56 We’re going to break it all down. 0:01:02 Let’s just go ahead and jump right in with Don Allen Stevenson. 0:01:06 When all your marketing team does is put out fires, they burn out. 0:01:10 But with HubSpot, they can achieve their best results without the stress. 0:01:16 Tap into HubSpot’s collection of AI tools, Breeze, to pinpoint leads, capture attention, 0:01:19 and access all your data in one place. 0:01:22 Keep your marketers cool and your campaign results hotter than ever. 0:01:31 Visit hubspot.com/marketers to learn more. 0:01:34 So really excited to dig in and thanks so much for hanging out with us today. 0:01:39 You’re out in San Francisco at the Masters of Scale summit. 0:01:41 So yeah, this is the first. 0:01:45 This is the first time we’ve chatted with somebody while they’re hanging out outside 0:01:47 of a conference. 0:01:50 First time we’ve had a conversation with somebody hanging out outside. 0:01:54 So kind of a different fun feel to it, but how are you doing? 0:01:56 Oh, I’m doing great. 0:01:57 Yeah. 0:01:58 It’s a really fun atmosphere right now. 0:02:02 They have a lot of people in different tech spaces coming together. 0:02:06 Reid Hoffman hosts this every couple of years to kind of bring a lot of minds together and 0:02:07 talk about stuff. 0:02:11 And so, yeah, I’m here to take notes and observe and report stuff online. 0:02:12 Amazing. 0:02:13 Amazing. 0:02:18 Well, we’re going to get into a lot of the sort of current happenings in the AI world. 0:02:23 The day that we’re recording this has been one of the craziest weeks in the world of AI. 0:02:29 Anthropic released autonomous agent functionality, Midjourney released some stuff, Runway released 0:02:34 some stuff, Ideogram, Midjourney, all of these companies dropped some really, really cool 0:02:35 new features. 0:02:36 We’re going to get into all of that. 0:02:40 But I think before we do, let’s let’s kind of get to know you a little bit. 0:02:44 I know your background is in visual effects at DreamWorks. 0:02:46 Would you mind breaking it down real quick? 0:02:50 You know, how what were you doing before this and how you got to doing what you’re doing 0:02:51 right now? 0:02:52 Yeah. 0:02:57 So before I was doing all the independent content creation stuff that I do now, I was 0:02:59 a teacher, a specialist trainer at DreamWorks. 0:03:02 So I taught all of our software, worked on How to Train Your Dragon and Boss Baby and 0:03:03 Trolls.
0:03:08 And my job was to teach our artists how to leverage the both the proprietary and the 0:03:11 third party software that we used to make the films. 0:03:12 I loved it. 0:03:18 I decided to resign because I wasn’t too thrilled with some of the directions, some of the technology 0:03:19 was going internally. 0:03:25 And I thought it might be easier to innovate and future proof outside of the studio. 0:03:32 I still love my friends and family at Dreamworks, but I found that I was easier to do some innovation 0:03:33 stuff outside of that. 0:03:38 And yeah, now I do a lot of content creation, writing books, use AI, I taught and trained 0:03:44 in AI to interview me and then had that help me write some books, do a masterclass on creativity 0:03:46 and AI and then consulting. 0:03:47 I don’t know. 0:03:48 It just kind of changes every day. 0:03:49 It just depends on who’s who’s asking. 0:03:50 I’ll just change the hat. 0:03:56 So some of some of my favorite stuff that I’ve seen from you on Instagram is when you kind 0:03:58 of get like theoretical, right? 0:04:01 You start using your phone and going, this is what could be, right? 0:04:05 Like you’ll be walking around in a grocery store and maybe you pick up a box of cereal 0:04:09 and point your phone, or looking at it with your glasses or something, right? 0:04:12 And it’ll start pulling up all this information and stuff flashing on the screen. 0:04:17 And from what we’ve seen in the world of AI, it looks like stuff that we might have right 0:04:21 now, but it’s all sort of conceptual and theoretical and you’re playing with it. 0:04:24 I just love those videos that you make on Instagram. 0:04:28 I’m curious, where like, where does, where does that like inspiration come from? 0:04:32 Like where do all these like really cool sort of futuristic ideas that you shoot videos around 0:04:33 come from? 0:04:34 Yeah, believe it or not. 0:04:40 And a lot of that inspiration comes out of a response of the show Black Mirror. 0:04:45 I was watching the show and I, you know, I like it, but at the same time, I’m just noticing 0:04:50 that it’s been giving people very little hope for the future and I’m kind of tired of that. 0:04:56 So I started that series called Clear Mirror to do the opposite of Black Mirror. 0:04:59 And instead of calling it white, I call it Clear Mirror because I want to have a more 0:05:02 transparent relationship with technology. 0:05:07 It’s not just a dark opaque system that you can’t look into or assess or observe. 0:05:11 And so yeah, I really just wanted to have more positive use cases, more positive stories 0:05:12 to tell. 0:05:15 And the, the result is these fun little videos out. 0:05:20 I often shoot them with these glasses and then add the effects onto them afterwards. 0:05:23 Cause I, to your same point, I think we’re going to have a lot of that stuff. 0:05:27 Like now, you know, like it’s not, they’re not that far, they used to be so far in the 0:05:28 future. 0:05:30 People were like, Oh yeah, that’s, that’s pretty normal. 0:05:33 I’m like, wow, damn, I’m zipping by. 0:05:34 Yeah. 0:05:35 Yeah. 0:05:38 Sometimes I’ll just scroll by the video and I’ll just see it on, on mute, right? 0:05:39 Like I’ll just be scrolling Instagram. 0:05:40 I’ll see it on mute. 0:05:43 And at first I’m usually thinking that these days, oh, that’s a real thing. 0:05:44 That’s, that obviously exists. 0:05:48 And then I’ll watch the video and go, Oh no, this is one of those like concepts that he’s 0:05:49 playing with. 
0:05:55 But I mean, it feels like the, the, the really getting closer and closer together, right? 0:05:59 Those like theoretical sci-fi things are starting to feel more and more real. 0:06:03 And it’s hard, harder and harder to tell, like, is, is that actually something we can 0:06:04 do? 0:06:08 Because I mean, looking at those Orion glasses that we got to see, you know, a few weeks 0:06:14 ago, a lot of this stuff you were showing off, like those glasses do, like it’s pretty 0:06:15 crazy. 0:06:16 100%. 0:06:17 Yeah. 0:06:21 I mean, like on the Orion’s, like, you know, they got down the form factor that they, they 0:06:22 track well. 0:06:28 And you can have, you can talk to AI characters, you can talk to in-person avatars that are 0:06:31 both either a stylized version or a photorealistic version. 0:06:34 So it’s like, oh, that, that exists now. 0:06:36 It’s not, that’s not sci-fi. 0:06:37 That’s a real thing. 0:06:41 Whether it’s, you know, ready for consumer adoption still, still to come. 0:06:44 But at the same time, it’s not, it’s not fantasy anymore. 0:06:45 Yeah. 0:06:46 Yeah. 0:06:48 So anybody that’s like listening to the audio that’s might not be watching the video, you’re 0:06:51 wearing the Meta Ray-Ban sunglasses. 0:06:53 We’ve talked about them a lot on this, on this show. 0:06:57 I’m, I’m a fan of them, but you mentioned that you managed to like hack them and run 0:06:58 chat GPT. 0:06:59 I’m curious. 0:07:02 Is that like, is that a really hard process to do? 0:07:03 Is that something you could share? 0:07:04 Like, how does that actually work? 0:07:05 Yeah. 0:07:07 So I, full disclosure, I told Meta about this. 0:07:10 So they know that that, and I was like, Hey, what are you going to do about it? 0:07:13 And they’re like, Oh, they actually thought it was kind of cool, but maybe, you know, 0:07:16 they didn’t like, they’re like, why not just use their AI? 0:07:18 But yeah, in general, it is pretty friendly to do. 0:07:20 I can’t code at all. 0:07:24 So how I did it was, I’ve had it for about a year, by the way. 0:07:29 And I got chat, I just asked chat GPT how to do it. 0:07:35 And chat GPT four, not even 4.0, not even the reasoning, not the 0.1, open app, it’s 0:07:38 not, it was the old one, chat GPT four. 0:07:45 And what I asked it was like, Hey, I have an API key to, you know, chat GPT with voice. 0:07:50 And I have the Apple Siri shortcuts built onto my iPhone. 0:07:54 Is there a way that you can give me step by step instructions that would allow me, someone 0:08:01 who cannot code, to understand how to plug and play in Apple Siri shortcuts to get a 0:08:05 thing where if I hit the action button on my Apple watch, or if I hit the action button 0:08:13 on the iPhone 15 or 16 to trigger voice back and forth conversational mode with chat GPT, 0:08:16 but on the Ray-Ban glasses, I would say it’s not too hard to set up. 0:08:19 But maybe it’s a few weird steps. 0:08:21 It’s kind of like built in now. 0:08:24 So I think now you probably don’t even have to do the whole shortcuts thing. 0:08:31 I think chat GPT recently released a widget that does that and I was like, Oh my God. 0:08:35 But before the widget, you could do it with the action button and then, yeah, triggers 0:08:39 the voice mode and the main use case was just more conversational, you know, you can talk 0:08:42 back and forth and have a, and you can interrupt it. 0:08:45 And it was when it had the model that sounded kind of like Scarlett Johansson is when I was 0:08:46 using it. 0:08:47 Right. 
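For anyone who wants to recreate the round trip Don describes, here is a minimal sketch in Python, assuming the official OpenAI SDK: transcribe a recorded question, get a chat reply, and synthesize speech to play back. His actual setup is a no-code Apple Shortcut tied to the Action Button, so this is only a rough equivalent, and the model names, voice, and file paths are illustrative assumptions rather than his configuration.

```python
# Rough equivalent of the voice round trip described above, using the OpenAI SDK.
# Assumptions: OPENAI_API_KEY is set, "question.wav" is audio captured from the
# glasses' mic, and the reply is written to "reply.mp3" for playback.
from openai import OpenAI

client = OpenAI()

def voice_turn(audio_path: str, history: list[dict]) -> str:
    # 1. Transcribe the captured audio to text.
    with open(audio_path, "rb") as f:
        question = client.audio.transcriptions.create(model="whisper-1", file=f).text
    history.append({"role": "user", "content": question})

    # 2. Get a conversational reply from the chat model.
    completion = client.chat.completions.create(model="gpt-4", messages=history)
    answer = completion.choices[0].message.content
    history.append({"role": "assistant", "content": answer})

    # 3. Synthesize the reply as speech for playback.
    speech = client.audio.speech.create(model="tts-1", voice="alloy", input=answer)
    with open("reply.mp3", "wb") as out:
        out.write(speech.content)
    return answer

history = [{"role": "system", "content": "You are a concise voice assistant."}]
print(voice_turn("question.wav", history))
```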
0:08:48 I loved having that. 0:08:49 Wow. 0:08:50 Her is real. 0:08:53 I have Samantha on the glasses. 0:08:58 And then they, you know, canceled the sound alike person and then now I lost that voice. 0:09:02 But hey, it was great while I lasted. 0:09:03 Which voice are you using now? 0:09:04 The British sounding one. 0:09:05 I thought was kind of fun. 0:09:07 I don’t know the character’s name. 0:09:08 I went school. 0:09:09 Yeah. 0:09:10 I know what you’re talking about. 0:09:11 That’s awesome. 0:09:12 I appreciate you sharing that with us. 0:09:16 I mean, it’s crazy what these tools can do like Claude and chat GPT where you can just 0:09:22 like, I, I built a whole video game using chat GPT once just going back and forth saying, 0:09:24 um, this is what I’m trying to make. 0:09:27 And then when there was an error, I would just go back and say, I don’t know how to 0:09:28 code. 0:09:29 So what’s this error and how do I fix it? 0:09:30 They would tell me how to fix it. 0:09:36 And then I mean, it, it took me several hours to get to where I want to go. 0:09:40 But still several hours to develop a game is a lot better than the old fashioned way. 0:09:41 100%. 0:09:43 And do you have a coding background? 0:09:44 I don’t. 0:09:47 I mean, I know how, I know how to do HTML and build websites, but that’s about the 0:09:48 extent of it. 0:09:49 Yeah. 0:09:50 I mean, that says everything right there. 0:09:53 It’s like, you can have a game in a couple hours with today’s tools. 0:09:57 That was not a normal thing, even like six months ago. 0:10:01 Well, you know, speaking of, of Claude, they just rolled out a new feature. 0:10:05 And now I just want to kind of get into like all of these new stories that came out. 0:10:08 We can all just sort of like riff on our thoughts and where this is all headed. 0:10:13 But you know, as of this recording, Claude just released a brand new feature called, 0:10:18 I think they call it computer use, which is not my favorite, like naming convention for 0:10:19 it. 0:10:22 But like, Anthropic just released computer use, right? 0:10:26 And I actually went and tested it yesterday and it was pretty cool. 0:10:27 It ran into some issues. 0:10:30 I ran into like rate limit issues where it would go through a whole bunch of steps and 0:10:35 then it would say, oh, you’ve run into a rate limit and it wouldn’t continue for me. 0:10:39 And I ran into some stuff like that, but it was really, really interesting to actually 0:10:45 watch it go and like open up Firefox for you, go move the mouse to the command bar, type 0:10:47 in a search for you. 0:10:51 And then once it searches, basically I gave it the prompt, go to Matt Wolf’s YouTube 0:10:56 channel, find the top five most popular videos and tell me how long they were all published, 0:10:59 how long they were all published and then add them to a spreadsheet for me. 0:11:00 Wow. 0:11:04 And it actually managed to go through every single one of the steps, go to my YouTube, 0:11:09 click on popular, sort by popular, grab the title, copy, paste it into a spreadsheet and 0:11:13 it filled out a spreadsheet of the top five most popular videos. 0:11:17 Now if I’m being honest, I could have done that process myself, you know, four times 0:11:22 as fast, but the implications I think are really, really cool that I could just give 0:11:23 it a command. 0:11:24 This is what I need you to do. 0:11:28 Now go off and do it and I could walk away from my computer and it’ll just go through 0:11:31 all the steps until it completes the thing. 
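What Matt walks through here maps onto Anthropic's computer use beta, where the model is handed a virtual display and replies with click, type, and screenshot actions for your own code to carry out. Below is a minimal sketch of kicking off that kind of request, assuming the anthropic Python SDK and the beta names from the October 2024 announcement; the task prompt and display size are made up for illustration, and a real setup, like the reference demo Matt tried, also needs a sandboxed desktop plus a loop that executes each returned action and feeds results back.

```python
# Minimal sketch of starting a computer-use request with the Anthropic SDK.
# Assumptions: ANTHROPIC_API_KEY is set and the October 2024 beta identifiers
# are still current. This only prints the actions the model asks for; a real
# agent loop would execute them against a sandboxed desktop and reply with
# screenshots and results until the task is finished.
import anthropic

client = anthropic.Anthropic()

response = client.beta.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    betas=["computer-use-2024-10-22"],
    tools=[
        {
            "type": "computer_20241022",
            "name": "computer",
            "display_width_px": 1280,
            "display_height_px": 800,
        }
    ],
    messages=[
        {
            "role": "user",
            "content": "Open the browser, sort a YouTube channel's videos by "
                       "Popular, and list the titles of the top five.",
        }
    ],
)

# The reply mixes text blocks with tool_use blocks describing the next action
# (take a screenshot, move the mouse, click, type, and so on).
for block in response.content:
    print(block.type, getattr(block, "input", None))
```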
0:11:33 But yeah, pretty crazy. 0:11:35 Have you played with it at all yet? 0:11:36 I actually haven’t. 0:11:37 There’s been so much stuff. 0:11:40 I’ve been reviewing it and watching videos on it. 0:11:43 But I mean, I think by this time next year, it’s not going to be about like the single 0:11:45 AI agent that you’re using. 0:11:49 It’s going to be like, what is your team of AIs that you manage and you’re the CEO of? 0:11:50 What is your team doing? 0:11:55 And you’re going to be like, oh, I’m Matt and my team does like all these things. 0:11:57 You’ll have like all these agentic properties. 0:11:58 I’m excited for that. 0:12:02 It’s going to get weird though because we’ll be like, wow, remember when it was so slow 0:12:05 way back in 2024, things were so slow. 0:12:07 All pre-agent days. 0:12:08 Yeah. 0:12:09 You had to do work on your computer. 0:12:12 You actually had to use the computer to do work. 0:12:13 Yeah. 0:12:17 I know this might seem like an obscure reference, but you know, Wally, did you watch the Wally 0:12:18 movie? 0:12:21 Do you remember the captain in the movie Wally? 0:12:22 Yeah. 0:12:23 Yeah. 0:12:24 Yes. 0:12:25 Yes. 0:12:29 That gentleman, his main issue was that he was like the only person on earth of the 0:12:34 humans that could still read and he had the reading level of maybe like a fifth or a four 0:12:41 or five year old and it just maybe think like maybe down the road in a more negative light. 0:12:46 We might be returning to that where we’re like, we lose some of our, you know, if we 0:12:51 have so many AI agents doing every single task, we might forget some of that basic stuff 0:12:55 like, oh, remember when I have to like, he tries to open up a book by voice commanding 0:12:56 it. 0:12:57 Yeah. 0:12:58 He’s like, open. 0:13:00 And then he’s like, look at it. 0:13:01 That’s a black mirror. 0:13:02 Right. 0:13:03 That’s a black mirror. 0:13:04 Yeah. 0:13:05 Stay on the clear mirror. 0:13:06 You’re right. 0:13:07 Sorry. 0:13:08 My apologies. 0:13:09 Yeah. 0:13:10 Yeah. 0:13:11 I mean, I feel like this technology too, though, could like be teaching everyone to 0:13:12 read better and things like that. 0:13:16 Like, you know, making less time on the computer and more time in the real world, the physical 0:13:17 world as well. 0:13:18 So. 0:13:19 Yeah. 0:13:20 I feel like there’s like almost two potential paths, right? 0:13:25 There’s going to be people who use AI and they get way lazier as a result, right? 0:13:27 They’re like, oh, this does all my work for me. 0:13:29 I’m just going to let it do everything. 0:13:32 I’m going to go and smoke weed and play video games, right? 0:13:37 Like there’s going to be that sort of sector of people, but then the way, like ever since 0:13:43 AI has sort of bubbled up over the last three years, for me, it’s made me go way deeper 0:13:44 on stuff, right? 0:13:46 Like it’s made me go, oh, this is really interesting. 0:13:47 I want to dive deeper. 0:13:52 I’m going to go to perplexity and have perplexity, dig into all these resources for me and do 0:13:54 some research and I’m going to learn more about it. 0:13:58 I’m going to go to archive.org and some of these like white papers that these people 0:14:03 put out that I tried to read in the past, but like once it starts putting like letters 0:14:07 into the algorithms and different symbols that I don’t even recognize, these papers 0:14:11 are over my head, but I can now pull them into notebook LM and have notebook LM give 0:14:13 me a podcast that explains it to me. 
0:14:14 Perfect. 0:14:17 So I think I feel like there’s those two potential paths: I’m going to get way 0:14:25 lazier as a result, or I’m going to use this to really up-level, you know. 0:14:28 We’ll be right back, but first I want to tell you about another great podcast you’re 0:14:29 going to want to listen to. 0:14:34 It’s called Science of Scaling hosted by Mark Roberge, and it’s brought to you by the 0:14:39 HubSpot Podcast Network, the audio destination for business professionals. 0:14:44 Each week host Mark Roberge, founding chief revenue officer at HubSpot, senior lecturer 0:14:49 at Harvard Business School, and co-founder of Stage 2 Capital, sits down with the most 0:14:54 successful sales leaders in tech to learn the secrets, strategies, and tactics to scaling 0:14:56 your company’s growth. 0:15:01 He recently did a great episode called How Do You Solve for a Siloed Marketing and Sales, 0:15:03 and I personally learned a lot from it. 0:15:05 You’re going to want to check out the podcast. 0:15:09 Listen to Science of Scaling wherever you get your podcasts. 0:15:15 Thank you for that, kind of reminded me of the Clear Mirror. 0:15:16 You’re right. 0:15:19 Like, I will not go down that route here. 0:15:20 It’s very positive. 0:15:24 You know, you can have, like, I would love to have an AI agent that I build, like one 0:15:29 of the first things I’m going to try to do with computer use is get a little AI to be my professional 0:15:34 critic and email me how it feels about my content. 0:15:38 I’ve been training a critic based off of all the best critical feedback I’ve gotten 0:15:39 online over the years. 0:15:43 I’ve been collecting it anyway in a Google Doc, and they’re all feedback, so it’s not 0:15:48 like haters, but like comments that hurt me because they were right. 0:15:49 Yeah. 0:15:50 Yeah. 0:15:52 And I was like, oh, dang. 0:15:57 So, yeah, like, I’m going to basically, I would love to have Claude’s computer use periodically 0:16:03 check my content and then inform me on things like, “Hey, you’re not this about this.” 0:16:07 You know, just get like that nice formal critique where it’s like a thoughtful, actionable thing. 0:16:08 Yeah. 0:16:12 When it comes to these AI agents, you know, I want to go back to something you said where 0:16:17 you’re almost like the CEO and you have like a bunch of agents underneath you, and I love 0:16:20 that idea of like from a YouTuber perspective, right? 0:16:24 Like I would love to be able to use one of these agents and say, all right, here’s the 0:16:27 transcript from a video I’m about to publish. 0:16:32 Take this transcript and, you know, write a title for me, but to write a title, go do 0:16:36 some research on YouTube and find out what style of title is working really, really well 0:16:39 right now based on what you find. 0:16:41 Give me 10 potential titles. 0:16:46 That’s AI agent number one. AI agent number two, I need a good thumbnail for this video. 0:16:50 Go look on YouTube, find the thumbnails that are performing the best, take some screenshots 0:16:55 of them and analyze what works really well for thumbnails right now. 0:16:59 Come back, give me some ideas for a thumbnail, give me like 10 ideas. 0:17:00 I’ll pick from those 10. 0:17:04 And then once I pick one, you go and make that thumbnail for me, right?
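Here is a toy sketch of the producer-and-specialists pattern Matt lays out, assuming the OpenAI SDK again: each "agent" is just a chat call with its own role prompt, and a producer function hands the same transcript to each one. The prompts and function names are invented for illustration; a real version of this workflow would also wire in web research and image generation rather than plain text suggestions.

```python
# Toy sketch of a "producer" delegating one transcript to two specialist agents.
# Each agent is just a chat call with its own system prompt; the prompts and
# helper names here are illustrative, not an existing framework.
from openai import OpenAI

client = OpenAI()

def run_agent(role_prompt: str, task: str) -> str:
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": role_prompt},
            {"role": "user", "content": task},
        ],
    )
    return completion.choices[0].message.content

def produce_video_assets(transcript: str) -> dict:
    # The producer hands the same transcript to each specialist.
    titles = run_agent(
        "You study which YouTube title styles perform well and propose options.",
        f"Suggest 10 titles for a video with this transcript:\n{transcript}",
    )
    thumbnails = run_agent(
        "You analyze high-performing thumbnails and describe concepts in words.",
        f"Suggest 10 thumbnail concepts for this transcript:\n{transcript}",
    )
    return {"titles": titles, "thumbnails": thumbnails}

assets = produce_video_assets("...video transcript goes here...")
print(assets["titles"])
```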
0:17:08 And now it’s just like, I made the video, I give it the transcript, and it, you know, 0:17:13 like I’m the producer role and all the little roles under me know exactly what they’re supposed 0:17:16 to go do to complete the rest of the process. 0:17:21 And that to me is like such an exciting world because it’s like, it’s not really taking 0:17:25 away a lot of the creative tasks that I enjoy doing, it’s taking away more of the monotonous 0:17:27 tasks that I don’t enjoy doing, you know? 0:17:31 I feel like this world has been coming for a while and it’s refreshing. 0:17:36 Like, any of you Harry Potter fans or watch any of the Harry Potter movies? 0:17:37 Oh yeah. 0:17:42 So do you remember in the last movie when Dumbledore walks up to all the paintings and gives them 0:17:46 a series of complicated tasks to secure the castle? 0:17:51 And then the character in the painting left the frame, went into the wall and basically 0:17:57 did autonomous things around the castle, like setting up shields, notifying the right people. 0:17:59 That’s what I think we’re going to be all having. 0:18:04 Like that fantasy, that magic of talk to the painting, the painting will go do something 0:18:09 on your behalf, that’s like here now and I’m excited for everyone to have magic. 0:18:10 Yeah. 0:18:12 That’s such a good analogy. 0:18:13 My wife’s a huge Harry Potter fan. 0:18:14 I’ve seen them all. 0:18:17 I wouldn’t say that I’m like the biggest Harry Potter fan, but I’ve seen them all because 0:18:20 my wife and kids all watch them. 0:18:24 So I’ve seen them all at least three times, but no, that’s a great little analogy, a great 0:18:27 picture of what’s happening here. 0:18:31 And I mean, to me, that’s a super exciting world and I think it’s going to get to a point 0:18:38 too where you have the agent that is also like the CEO or producer who is like going 0:18:41 and telling each of these other agents to go do their role, right? 0:18:44 I think you’re going to have like multiple levels, right? 0:18:49 Where you just train your CEO and then the CEO agent goes and tells the other agents 0:18:50 what to do. 0:18:51 Yeah. 0:18:53 It’s like sci-fi and fantasy have like combined now. 0:18:56 Like these were things that were really sci-fi. 0:19:00 Like remember that magical feeling of like seeing a painting talk and interact and like 0:19:05 the fact that it could hear you, it could hear the new characters, the fictional characters. 0:19:06 That’s like, we have that now. 0:19:11 And if we add, you know, some of the other tools I know we’ll talk about, like AI Studio, 0:19:16 or sorry, not AI Studio, Act-One, you can almost build that actual painting character and it’s 0:19:18 actually, it would look that way. 0:19:20 So we can take it the whole mile. 0:19:21 Yeah. 0:19:22 Yeah. 0:19:23 Yeah. 0:19:24 It’s kind of shocking to me like how fast humans adjust to all of this. 0:19:26 Like this stuff is purely magic. 0:19:30 Like if you really think about it, if you step back and look at it, it is magical and people 0:19:31 just get used to it. 0:19:33 Like after a day, it’s like, yeah, of course it does that. 0:19:34 Whoa. 0:19:35 Right. 0:19:36 What do you know? 0:19:37 It’s like, what? 0:19:38 Yeah. 0:19:39 It’s funny. 0:19:40 It’s funny you say that. 0:19:41 Cause I, I recent, I mean, I actually read it several years ago, but I recently re-read 0:19:42 it. 0:19:44 There’s an obscure book called “Off to Be the Wizard”.
0:19:50 I don’t know if any of you have ever heard of it, but it’s about this kid who, he opens 0:19:56 up his computer one day and he finds this like mysterious file on his computer and he 0:20:01 starts looking through this file and realizes that he can like tweak things and it actually 0:20:03 tweak things in real life. 0:20:04 Right. 0:20:08 So he would like, he found like his bank account and the number that was in his bank account 0:20:11 and he added an extra zero and then he logged into his bank account and there was extra 0:20:12 zero there. 0:20:13 Right. 0:20:17 And so he figured out how to like tweak the real world by tweaking the code. 0:20:21 But then what he, but then like the, the feds caught onto him and said, this guy’s obviously 0:20:22 doing something illegal. 0:20:23 Right. 0:20:24 How does he have all this money? 0:20:27 Like, I don’t know how he did it, but there’s something weird going on. 0:20:30 And so he decides, I’m going to travel back in time. 0:20:35 So he goes back in time to the medieval age, but he goes back with his iPhone and all 0:20:40 of his, his computer devices and all that kind of stuff and he convinces everybody in 0:20:46 medieval times that he’s a wizard, but all he’s really doing is using today’s technology, 0:20:47 he has it back in medieval time. 0:20:49 And that’s the whole premise of the book. 0:20:54 And it’s amazing book, such a like a fun, fun read, but it’s like, that’s what, what 0:20:55 we’re seeing right now. 0:21:02 It’s like, if you went back a hundred years ago and started having a conversation with 0:21:04 them, people would think you were a freaking wizard. 0:21:05 Sorcery. 0:21:06 Yeah. 0:21:09 Like what power that they have. 0:21:15 Um, I feel like we’ve been using technology to kind of anapomorphize and create almost 0:21:16 every Greek God. 0:21:18 Do you remember Hermes? 0:21:25 He was the messenger God and his magical power was like sending a message instantly across 0:21:26 vast distances. 0:21:32 Now it’s like, you get a phone call and you’re like, I don’t want to answer it. 0:21:37 That was a God like power in earlier, you know, belief systems. 0:21:41 Like I just think that’s so interesting to your point that you made, you know, Nathan 0:21:44 around how quickly we get used to things. 0:21:47 That was, that was the land of gods and goddesses. 0:21:50 And then now it’s like the climbable. 0:21:51 Yeah. 0:21:52 Yeah. 0:21:57 And one thing I see coming too is like the, these sort of multi-step tool use things are 0:22:01 probably coming to like the Meta Ray Bands pretty soon as well, right? 0:22:02 Like it’s probably only a matter of time. 0:22:05 I mean, you probably even have more insight than we do on this, but it’s probably only 0:22:09 a matter of time where you just like look at something with your glasses and then say, 0:22:10 oh, that’s a cool backpack. 0:22:13 Go buy that for me on Amazon, right? 0:22:17 And then like, you know, you get an email confirming that this was purchased on Amazon 0:22:19 and it’s sitting on your door the next day, right? 0:22:23 Like combine the glasses with the ability to go do the shopping for you. 0:22:25 And it’s like anything you see in the real world. 0:22:26 I want that. 0:22:27 Okay. 0:22:28 I’ll go get it for you. 0:22:29 Right. 0:22:30 Like that’s coming as well. 0:22:33 I think I even seen like a, well, I haven’t seen a live example of this, but it’s a theory 0:22:40 I have that a lot of the next generation of ads are going to be for AIs to see. 
0:22:46 And then that AI knows you’ve given it some allowance, some permissions, even like an 0:22:52 allowance like, hey, you’re allowed to spend up to this amount per month on what you know 0:22:54 about me, what I’ve approved. 0:23:00 And then the ad that that AI is going to see is going to hit your AI that’s trained on 0:23:03 your preferences and it will go make that purchase. 0:23:05 And then it just seems like it magically appears on your doorstep. 0:23:06 Yep. 0:23:07 Yep. 0:23:08 I could see that. 0:23:10 Well, let’s talk about some of this other really cool stuff because you obviously have 0:23:14 a background in, in like visual effects, DreamWorks and stuff like that. 0:23:18 And we just got to see this Runway Act-One, which if you’re listening and you haven’t 0:23:23 heard of Runway Act-One, it’s this new feature, I believe they’re rolling it out. 0:23:24 I don’t have it in my account yet. 0:23:28 I believe it’s like rolling out fairly soon though, but it’s a feature where you can take 0:23:36 a video and sort of record yourself on video, feed it to this Runway Act-One and then it 0:23:40 will make like a cartoon animation and it will be lip-synced to you and it will follow 0:23:42 your same emotion. 0:23:46 So like if you look happy, if you’re laughing, if you’re sad, if you’re angry, those emotions 0:23:51 theoretically will show up on the face of this cartoon character based on the emotions 0:23:54 that you had in your original video. 0:23:58 And I’m just curious, your thoughts because I know, you know, coming from a background 0:24:04 of like visual effects and Hollywood and all of that kind of stuff, I know there’s like 0:24:06 fear around a lot of these tools too. 0:24:08 So like what are your initial thoughts on it? 0:24:13 The people that you do know that are working in video and Hollywood, like what is the general 0:24:14 consensus? 0:24:15 Are people excited about stuff like this? 0:24:17 Are people really fearful about stuff like this? 0:24:19 Like where do your thoughts lie with this kind of thing? 0:24:23 Yeah, so it really comes down to two words, generalists versus specialists. 0:24:25 My generalist friends are thrilled. 0:24:27 They’re like, “This is so exciting. 0:24:28 This is all great.” 0:24:32 Like they’re already generalists and this is just another tool that they can add that 0:24:34 will enhance what they’re doing. 0:24:41 My specialist friends, specifically the ones I trained, aren’t thrilled because their specialist 0:24:44 skills were like facial rigging. 0:24:49 That’s a whole job at DreamWorks and Pixar and Illumination Films, where all you do is rig 0:24:54 a 3D puppet to allow it to emote and to create all those faces. 0:24:59 So like if you’re a specialist in the animation and film world right now, you’re not thrilled 0:25:00 with a lot of these AI developments. 0:25:03 In fact, you’re scared and angry. 0:25:06 But if you’re a generalist, you’re like, “Oh my goodness.” 0:25:10 And then another framework is size of studio. 0:25:16 So the larger studios are more specialist and the smaller studios have more generalists 0:25:17 as artists. 0:25:24 So you’re going to see smaller studios, smaller animation films, teams, companies, small studios 0:25:28 are going to adopt a lot of these tools because they already are wearing all the hats. 0:25:33 The larger studios are not going to want to adopt these tools because they have mostly 0:25:34 specialists. 0:25:38 So they don’t want to scare all their talent away.
0:25:40 They will replace their talent when they can. 0:25:41 Yeah. 0:25:46 So, I mean, being somebody that’s kind of had your foot in both worlds, like the video 0:25:51 production world as well as the AI world, like do you have advice for the people that 0:25:52 are scared? 0:25:55 Like what kind of stuff do you tell the ones that are sort of freaking out, maybe more 0:25:56 of the specialists? 0:25:57 Yeah. 0:26:02 Right now the main advice I give to like my fellow specialist friends that have deep skills 0:26:06 is they have to start actually using these tools. 0:26:11 They can’t just ignore it anymore unless they only want their skill to be a hobby. 0:26:15 If they just want it to be a hobby, then don’t change a thing. 0:26:21 But if their idea is you want to also have a career in this thing, you cannot roll your 0:26:24 eyes at the latest AI update anymore. 0:26:28 That’s what’s been my advice. 0:26:34 The other one is they could double down or triple down on purely human-generated content. 0:26:39 They can go like the craftsman, artisanal, Etsy route. 0:26:43 I also recommend that to some artists that are really anti the AI stuff. 0:26:47 If they go down that route, they have to basically increase the prices dramatically. 0:26:49 It has to be a luxury product. 0:26:53 If their idea is like, “Oh, I want to make my artwork and my film really accessible to 0:26:57 lots of people and I’m going to make it from scratch by hand,” the only people that 0:27:03 I believe are going to be able to afford that are high-end luxury clients. 0:27:08 So it’s like if they want to reach a general audience, they might have to use AI tools now and they 0:27:10 can just be like, “Oh, no.” 0:27:11 Yeah. 0:27:15 From what you’ve seen, how close do you think we are to these tools being able to completely 0:27:17 replace riggers and things like that? 0:27:24 It depends because I used to think that there was a threshold of quality that was required 0:27:25 to be met. 0:27:30 I don’t believe in that anymore because I’ve learned that quality content is relative. 0:27:39 There’s some people who find that Skibidi Toilet kind of animation, there’s a certain 0:27:44 audience that actually finds that to be quality animation and quality content. 0:27:46 The rigs on those things are terrible. 0:27:51 The rigs, the way that the mouth blends and the rig is terrible. 0:27:55 If you showed that to an artist at Pixar or Disney or DreamWorks, they would say, “That’s 0:27:56 a terrible rig. 0:27:58 No one’s going to watch that,” but then they do. 0:28:03 So I think you can use a lot of these tools today, like a creative person can use these 0:28:09 tools to tell stories and provide value today because I think value is subjective. 0:28:15 I get a lot of anger online from saying that. Some of my more film and specialist friends 0:28:21 want to say that value is not relative, that there’s a difference between quality content. 0:28:24 I kind of disagree with that, unfortunately. 0:28:25 Yeah. 0:28:26 I can see that.
0:28:58 They had some sort of story and there was comedy and the sort of visual aspect of it 0:29:01 was sort of the last priority of it. 0:29:05 But people loved that stuff and they would go viral and there’s examples of stuff like 0:29:06 that today. 0:29:09 You can find YouTube channels that are, I mean, South Park, right? 0:29:11 Look at South Park. 0:29:16 It’s like fairly crude looking construction paper animation and it’s still one of the 0:29:18 most popular comedy shows today. 0:29:19 That’s an ideal example. 0:29:20 Yeah. 0:29:24 I mean, just to add to that, same point, the type of animation that we do at a lot of 0:29:29 animation studios is called an animatic and for those that are half familiar, it’s a really 0:29:34 rough pass of an animation that gets its story down, but the actual display and the actual 0:29:41 audio that you’re hearing is usually temporary or in flux or being changed out a lot. 0:29:46 And my proof that I hope more big studios actually listen to is that we would do audience 0:29:52 testing for films with just the animatics where basically no shot is done and they’re 0:29:53 all at different stages. 0:29:58 Some stuff is fully rendered in 3D, some stuff is just hand sketched, some stuff is basically 0:30:04 an early comic, but when you see the audience reacting to it, they love the story, they’re 0:30:06 laughing at the right times, they’re responding. 0:30:14 I wish more DreamWorks artists, Disney artists in film, TV in general, film in general, acknowledged 0:30:18 an animatic can actually work on its own. 0:30:24 There’s certain situations where you might not need to spend $300 million on rendering 0:30:28 to tell that same story that the audience loved. 0:30:32 They loved the story, especially when there’s a lot of talent behind it and that was very 0:30:33 eye-opening. 0:30:36 We had a whole, we brought in a big elementary school once to do an early screening of How 0:30:41 to Train Your Dragon 3 and it was only in animatic form and the kids loved it. 0:30:47 They were laughing and cracking up and the shots would be stick figures at times, rough 0:30:52 sketches and then other times it’d be a fully 3D rendered model and I’m sitting there thinking 0:30:56 like, “Don’t they know that it’s not done, the film’s not done yet? 0:30:58 How do they enjoy it so much?” 0:31:00 It didn’t matter. 0:31:04 It feels like all of this really levels the playing field and I think it’s going to allow 0:31:08 some of the new kinds of films to be created because I spent a little bit of time in Hollywood. 0:31:12 I was partners with Barrie Osborne, the producer of Lord of the Rings and The Matrix. 0:31:16 We spent about a year and a half trying to create a movie studio together. 0:31:20 This crazy scenario where we had a friend who was an early crypto investor and he was 0:31:24 going to help fund things and pulled together capital and so I spent a year and a half going 0:31:28 out to New Zealand, going out to Hollywood, meeting all these people and he would take 0:31:33 me on like a Disney movie set and had all these different heads of departments tell 0:31:37 me what their job was and explain everything to me and answer any questions. 0:31:41 I was just, you know, it was shocking how much money all this cost. 0:31:46 To make a film now, when they greenlight a film, it’s almost always a sequel or something 0:31:47 like that, right? 0:31:53 Or based on some existing IP that people love because it costs $300 million to make, right?
0:31:57 You can’t take any creative risk with $300 million, right? 0:31:59 Or very little. 0:32:02 And so with this kind of technology, you’ll start getting the cost of films possibly down 0:32:07 to millions or tens of millions and you can take a lot more risk and you’ll see new kinds 0:32:10 of stories emerge that would have never happened before. 0:32:14 So I’m like, you know, as a, as a consumer, very excited about this, to see like actually 0:32:15 new stuff. 0:32:18 It’s not just a sequel of a sequel or based on some book that I never read or. 0:32:19 Yeah. 0:32:20 Yeah. 0:32:21 Well, it totally democratizes. 0:32:25 I know that’s a buzzword, but it totally democratizes like video creation, right? 0:32:29 Like there’s so many people out there that are probably amazing storytellers but have 0:32:32 like no skills in visual effects, right? 0:32:38 Well, now this stuff is allowing them to at least roughly tell their story in some way, 0:32:39 right? 0:32:41 And that’s, that’s what’s exciting as well, right? 0:32:46 Because I’ve never been very technical when it comes to a lot of like, like I don’t know 0:32:48 how to use After Effects very well. 0:32:55 I don’t know how to use any of the like Blender or any of the like 3D animation tools or any 0:32:56 of that kind of stuff. 0:32:58 I think I can tell good stories. 0:33:01 I think I can make decent music. 0:33:06 And if I want to put video over it, I now have options to sort of do something with the video 0:33:12 portion of it, where I never felt like I had the skills to be able to do that before. 0:33:16 You know, are my videos going to look as good as somebody who’s been doing it in Hollywood 0:33:17 for the last 30 years? 0:33:22 No, but I can, I could get something out into the world that prior I wasn’t ever able to 0:33:23 get out into the world. 0:33:25 And to me, that’s what’s most exciting about it. 0:33:29 One thing, Don, one thing I wanted to ask, and I don’t know if this is stuff that you’re 0:33:30 allowed to share. 0:33:34 So if you can’t, just don’t worry about it, we’ll skip past it, but you’re one of the 0:33:41 few people that I know that actually has been able to use Sora already, but I’m curious 0:33:47 if you’re able to share any of your experiences actually getting to use Sora. 0:33:48 Nothing I can share yet. 0:33:49 Yeah. 0:33:50 No worries. 0:33:51 Had to ask. 0:33:53 When I can, I will share a lot. 0:33:54 Very cool. 0:33:55 What about Meta? 0:33:59 So we saw, you and I were both at the Meta Connect event this year. 0:34:04 We both got to see the Orion glasses and all of the stuff they’ve been working on. 0:34:09 One of the coolest things I saw on Instagram was your first-person view of you walking 0:34:12 out onto stage and shaking hands with Mark Zuckerberg. 0:34:17 I’m like, there is literally no better use case for these Meta Ray-Ban glasses than 0:34:20 showing yourself walking out on stage in front of a huge audience and shaking hands with 0:34:21 Zuckerberg. 0:34:25 Literally, that is the ultimate use case for those glasses right now. 0:34:28 But I’m curious, how did that whole thing come about? 0:34:34 I know you’re one of the first people who’s used the AI. 0:34:38 What’s the feature called where it’s like an AI version of yourself that people can 0:34:39 go and chat with? 0:34:42 You’re one of the first people that’s gotten to use that. 0:34:44 How did that all come about? 0:34:45 Yeah. 0:34:47 Two stories there.
0:34:52 The product, that AI product, is called Creator AI and it's part of AI Studio within Meta. 0:34:57 They have the ability to allow creators to build a custom AI model that's trained on 0:35:00 your social media data that you consent to. 0:35:04 I can check a list of what things it's allowed to learn from and train from, and then it's 0:35:06 added into its knowledge graph. 0:35:11 Kind of similar to building a custom GPT, if that GPT updated on the fly based off of how 0:35:12 I use social media. 0:35:16 If I post a thread tomorrow, that will be part of its brain. 0:35:21 If I leave a really long comment replying with an answer to a common question, that's also 0:35:26 now added to its brain, so it really speaks and communicates very much like me. 0:35:33 When it came to filming with the Ray-Bans and shaking Mark's hand on stage, I actually 0:35:38 did not get permission to film that or to share that. 0:35:43 I decided I thought this would be a really appropriate thing, but I was nervous they 0:35:45 were going to say no if I asked. 0:35:51 I just was like, "I'm going to film this and I'm going to share it." 0:35:54 I shared it right when I got onto a plane. 0:35:55 It was posted. 0:35:59 I couldn't see any feedback for several hours and I was like, "I'm either going to get 0:36:04 a lot of anger when I land or a lot of joy," and then there was a lot of joy. 0:36:06 They're like, "This is a great use." 0:36:13 I was like, "Yeah," the idea there was, like, they talk about the advantages of that kind 0:36:15 of form factor. 0:36:19 I've been using their Ray-Ban glasses I think since 2021, with their first versions that 0:36:23 came out back then, called the Ray-Ban Stories. 0:36:29 I like them because it's a less distracted viewing experience, less distracted capturing 0:36:30 experience. 0:36:34 Believe it or not, with the old versions of these glasses, when I proposed to my wife, I wore 0:36:43 the glasses while I proposed to her and got my first-person view of her response saying, 0:36:44 "Yes," thank goodness. 0:36:50 I mean, to be fair, it probably would have gone viral either way. 0:36:51 Right. 0:36:53 There's certain moments where I'm like, "Okay, this makes a lot of sense." 0:36:56 It would have been really inappropriate for me to pull out my phone when I was proposing 0:37:01 to her and put the phone in front of us like, "What's your answer to this?" 0:37:07 They were the transparent ones, so they weren't the shades. 0:37:14 We were looking at each other's eyes and we could still capture that memory. 0:37:19 The reason why Meta reached out to me for those products is those concept videos. 0:37:24 Believe it or not, I made those theoretical concept videos, posted them on Instagram, of what 0:37:28 I want to see out in the world, and I try to do the opposite of the show Black Mirror. 0:37:34 A lot of people will try to tie their technologies to more dystopias, and I'm actively not 0:37:35 doing that. 0:37:40 I'm trying to come up with creative positive uses, so they see that consistently, and I'm 0:37:45 just doing it on my own, so they're like, "Wait, maybe if we support him, maybe he can 0:37:49 do this more," and they have, so I'm like, "Oh, great. 0:37:53 I can tell more positive stories and I'll use your tech to do it." 0:37:54 Super cool.
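Meta hasn't published how Creator AI works under the hood, but the "updates on the fly" behavior Don describes maps onto a familiar pattern: embed every new post, keep an index, and retrieve the most relevant posts whenever someone asks the AI a question. Here is a minimal sketch of that idea, assuming the sentence-transformers library; the model name, example posts, and function names are all illustrative, not Meta's implementation:

```python
# Toy version of a creator knowledge base that grows as new posts are published.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder; any sentence embedder works

posts: list[str] = []            # the creator's consented posts and comments
vectors: list[np.ndarray] = []   # one embedding per post

def add_post(text: str) -> None:
    """Call this whenever a new thread, reel caption, or long comment goes up."""
    posts.append(text)
    vectors.append(model.encode(text, normalize_embeddings=True))

def retrieve(question: str, k: int = 3) -> list[str]:
    """Return the k past posts most relevant to a fan's question."""
    q = model.encode(question, normalize_embeddings=True)
    scores = np.stack(vectors) @ q  # cosine similarity, since embeddings are normalized
    top = np.argsort(scores)[::-1][:k]
    return [posts[i] for i in top]

add_post("I try to make 'Clear Mirror' concept videos: optimistic takes on new tech.")
add_post("The Ray-Ban glasses are my go-to for hands-free, first-person capture.")
print(retrieve("What do you use to film point-of-view footage?"))
```

A chat model would then be handed those retrieved posts as context, which is what keeps the answers sounding like the creator rather than a generic bot.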
0:37:57 Well, there's one last gadget I want to ask you about because you mentioned it before 0:37:58 we hit record. 0:38:02 You're wearing like the little Plaud thing, right? 0:38:06 Basically it's a device that records all day long and keeps notes for you, right? 0:38:07 Is that what it is? 0:38:11 It's the Plaud AI NotePin, it has a magnetic back. 0:38:14 You can wear it like a lapel pin on your shirt, like a button. 0:38:18 It also comes with like a necklace, so you can wear it as a necklace, and it also has 0:38:23 like a wrist-worn interface, so it looks like a watch and you can kind of wear it there. 0:38:24 And a clip. 0:38:28 They have a magnetic clip, so you can pin it onto a bag or something. 0:38:33 But yeah, what it does is it takes a listen and summarizes your day. You get a nice transcript 0:38:39 out of whatever you record, and then their tool actually summarizes it and organizes 0:38:41 it based off of some templates. 0:38:49 So you can make it for meetings, make it for conferences, make it for random tasks, and 0:38:54 it'll reorganize that transcript and give you a nice summary. 0:38:55 I love it. 0:38:57 It's useful and they have a card version as well. 0:39:00 So if you're just like at a conference all day long, you can just wear it. 0:39:03 It will take notes on every speaker you see. 0:39:04 It's great for conferences. 0:39:09 So right now I'm at the Masters of Scale conference, and all these speakers, I can't actually process 0:39:11 how much information they're sharing. 0:39:12 It's a lot. 0:39:18 So you turn on the Plaud AI NotePin, and I can trust that it's going to ingest all of 0:39:21 this and then give me a transcript. 0:39:26 And if you're not pleased with the summary that it makes, you have that transcript. 0:39:30 Copy it, paste it into your, you know, your large language model of choice, 0:39:35 and then say, you know, what were like the three biggest takeaways today? 0:39:40 And it does a great job at quickly and instantly giving you the three best takeaways. 0:39:45 And if you don't like them, you can say, well, actually, any others, you know, and like you 0:39:47 can just have a regular conversation. 0:39:53 It's like having a journalist or an assistant with you who can just document your day. 0:39:58 And you know, that's super helpful if you're a content creator and you are trying to ingest 0:39:59 a lot of things. 0:40:00 It can research all this stuff. 0:40:07 It can help, you know, alleviate that, and I can review it. It saves you from having to carry 0:40:13 a notepad around and take notes all day, which for me, I love that. 0:40:15 So it seems like you're like a pretty big gadget guy. 0:40:16 I love gadgets. 0:40:19 I've got like pretty much every AI gadget that's come out. 0:40:20 I've got it. 0:40:21 I've got the Rabbit. 0:40:22 I've got the Plaud. 0:40:23 I've got the Compass. 0:40:25 I've got, I've got like all the things. 0:40:26 The Plaud is actually on the way. 0:40:30 I don't have it yet, but I got a notification that it shipped like a couple of days ago. 0:40:34 But are there any other like really, really cool gadgets that you think more people 0:40:35 should know about? 0:40:37 Yeah, actually I'm wearing it right now for those that don't see it. 0:40:41 It's the Hollyland Lark M2 microphone. 0:40:43 I recommend it for two reasons. 0:40:47 One, it's got AI noise canceling built right in. 0:40:52 And it comes with two lapel mics so you can like have two people do an interview, and it's 0:40:54 a very good quality sounding mic. 0:40:57 And then let me just adjust it. 0:40:58 Yeah.
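The "copy the transcript, paste it into your large language model of choice" step Don describes above is easy to wrap in a few lines if you do it every day. Here is a rough sketch using the OpenAI Python client; the model name, file name, and prompt wording are placeholders, and any chat-capable model would do the same job:

```python
# Rough sketch: turn a day-long transcript into a short list of takeaways.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def biggest_takeaways(transcript: str, n: int = 3) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; swap in whatever model you prefer
        messages=[
            {"role": "system", "content": "You summarize long event transcripts."},
            {"role": "user", "content": (
                f"Here is today's transcript:\n\n{transcript}\n\n"
                f"What were the {n} biggest takeaways?"
            )},
        ],
    )
    return response.choices[0].message.content

with open("conference_day_transcript.txt") as f:  # hypothetical export from the note-taker
    print(biggest_takeaways(f.read()))
```

Don's "okay, any others?" follow-up is just another user message appended to the same conversation.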
0:41:06 The other thing on that is iOS 18 on iPhone now has a new voice memo feature that has 0:41:14 transcription built in, and now I'm sitting here like, do I still need to use my Plaud for everything? 0:41:20 I can just have that right in the Voice Memos app, and it saves your transcript right 0:41:27 alongside the audio file, and then it's searchable, so you can just search in text your audio 0:41:28 file. 0:41:31 It looks for the word and then it plays the word. 0:41:36 It's like a little, like lyrics videos where the word pops up and you hear the 0:41:37 lyric. 0:41:42 It does that in the freaking built-in app on iPhone and iPad and Mac. 0:41:46 I think all iOS 18 devices can do that now. 0:41:49 I'm sitting there like, oh, shoot, okay. 0:41:50 Huh. 0:41:55 So right now I'm kind of using them both for note taking, but we'll see what's best. 0:41:59 You could compare notes, pull them both into a large language model later and be like, 0:42:01 hey, what did this one find that this one didn't find? 0:42:04 What did this one hear that this one didn't? The audio quality is different. 0:42:07 You know, the Plaud one is not for audio quality. 0:42:09 It's a very low quality mic. 0:42:13 It's high enough to get the details of what happened, but not high enough that you'd use 0:42:15 it professionally for audio. 0:42:24 Whereas the Hollyland Lark M2, you can actually use that audio, as well as, you know, 0:42:27 use it as a tool for capturing the data. 0:42:28 Right. 0:42:29 Right. 0:42:31 I imagine the Plaud's probably like optimized for battery life because they expect people 0:42:36 to just have it on all day long, right, where the other one, that's probably more designed 0:42:40 to be turned on and turned off as you need it. 0:42:41 Another device I recommend: 0:42:43 I'm using it right now for my tripod. 0:42:49 It's the Insta360 Flow Pro. 0:42:50 It's nice. 0:42:53 I have it attached to my iPhone right now. 0:42:58 And it's a robotic arm that can actually do the person tracking. 0:43:01 And so I'm making a lot of content on the go. 0:43:04 And it's like having a kind of a camera operator with me wherever I go. 0:43:07 And then it folds out three legs. 0:43:08 So it's also a tripod. 0:43:10 It has an extending neck. 0:43:13 So it looks like a selfie stick. 0:43:18 And with the tap of a button, you can orient it for horizontal content or vertical content. 0:43:20 It kicks butt. 0:43:24 And everything fits in this tiny little bag I carry with me. 0:43:28 Somebody should just make, somebody should just make a bag, like a fanny 0:43:32 pack type bag, where the whole bag is just like a wireless charger and anything you throw 0:43:33 in the bag, 0:43:34 it just charges it up. 0:43:35 Somebody should make it. Listeners, do it. 0:43:36 Please. 0:43:42 I mean, I think we covered a lot of ground on this one. 0:43:46 I know you're at a conference and probably anxious to get back in and hear what everybody's 0:43:47 talking about. 0:43:50 Is there any place that people should go and check you out? 0:43:56 I know on Instagram you post a lot of amazing videos and some of these concepts 0:43:58 over there. 0:43:59 Shout out your Instagram account. 0:44:00 Yeah. 0:44:01 Instagram is my go-to. 0:44:02 I do a lot of live streams. 0:44:03 I share a lot of stuff on Threads. 0:44:04 A lot of reels. 0:44:09 So it's at D-O-N-A-L-L-E-N-I-I-I. 0:44:14 I am Don Allen Stevenson III, but I go by Don Allen III on Instagram and Twitter. 0:44:16 But yeah, just Don Allen, I-I-I.
0:44:18 And you have a new book. 0:44:20 I've got it in my hand here. 0:44:25 Make a Seat. Quick elevator pitch, 30 seconds. 0:44:26 What's the book about? 0:44:27 Why do people want it? 0:44:30 I talk about how I make opportunities for myself. 0:44:33 The whole idea of making a seat is how do you make opportunities? 0:44:35 And I use a lot of AI to do that now. 0:44:40 So I decided to kind of write a book in three parts: how to discover opportunities with AI, 0:44:44 how to leverage technology like AI, and then also how do you build resilience? 0:44:46 There's a lot of change that's about to happen. 0:44:49 And I figured, let me share a bunch of my life stories and tools and techniques and put it 0:44:50 all into a book. 0:44:56 I trained an AI to interview me, and then a separate AI to organize that into chapters, and a third 0:45:02 AI to organize the chapters into a book. Written in two months, about 30,000 words. 0:45:04 And I stand by it. 0:45:05 Very cool. 0:45:11 So it's a book co-written by Don Allen Stevenson III and AI. Very much so. 0:45:12 Awesome. 0:45:18 Well, thank you so much for taking time out of the Masters of Scale event to come 0:45:19 chat with us. 0:45:21 It's been so much fun. 0:45:25 Always a blast nerding out with you about AI and tech and gadgets and stuff. 0:45:27 So once again, thanks so much for joining us today. 0:45:28 Really, really appreciate it. 0:45:29 It's been fun. 0:45:30 Likewise. 0:45:31 And thank you so much for having me. 0:45:32 I really appreciate it. 0:45:33 This is so much fun. 0:45:34 And yeah, all the stuff that you do inspires me. 0:45:36 I reference your work all the time. 0:45:40 And it's just like, it's so cool that we keep bumping into the same circles over 0:45:41 and over again. 0:45:42 Absolutely. 0:45:43 I appreciate it, man. 0:45:44 Good talking to you. 0:45:45 All right. 0:45:50 Thank you so much for joining us. 0:45:51 Thanks. 0:45:53 Bye. 0:45:54 Bye. 0:45:59 [MUSIC PLAYING]
Episode 31: What will the future look like when AI agents take over mundane tasks? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) are joined by Don Allen Stevenson III (https://x.com/DonAllenIII), former DreamWorks specialist and author of “Make a Seat”.
In this episode, they dive deep into how AI agents are set to revolutionize our everyday lives by speculating on future AR products, and exploring AI-driven automation in advertising. Don Allen Stevenson III also shares insights from his book on leveraging AI and technology during times of change, and how he used AI to write and organize it.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) Former DreamWorks trainer, now independent content creator.
(04:04) Inspired by Black Mirror, created hopeful Clear Mirror.
(07:00) Setup ChatGPT voice shortcuts on Apple devices.
(09:55) Encountered rate limit issues but found automation impressive.
(15:27) Analyze top YouTube thumbnails for design inspiration.
(21:47) Runway Take One: Video to emotion-matched cartoon.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
These AI Workflows 10x’d Our Productivity (Q&A Special)
AI transcript
0:00:02 We've been getting a ton of questions from you guys 0:00:03 over on social media. 0:00:05 You know, how does this all play out? 0:00:08 What does this all look like in the future? 0:00:11 That sort of concept came up over and over and over again. 0:00:13 I'm hoping in the future it's just simplified. 0:00:15 Here's the one model. 0:00:17 We're going to answer a lot of your questions. 0:00:22 Hey, welcome to the Next Wave Podcast. 0:00:23 I'm Matt Wolf. 0:00:24 I'm here with Nathan Lanz. 0:00:28 On this episode, we're going to answer a lot of your questions. 0:00:31 And we deep dive into some really fun topics. 0:00:33 We're going to list out a ton of tools 0:00:35 that we're using in our own businesses, the tools 0:00:37 that we couldn't live without. 0:00:39 We're going to talk about the future of work 0:00:42 and what happens if AI takes all of our jobs? 0:00:44 Where do we go from there? 0:00:46 We're going to talk about large language models 0:00:50 and what we see as the future of large language models 0:00:51 and so much more. 0:00:53 Get a notepad ready. 0:00:54 We go into a lot of stuff. 0:00:57 This is a fun episode, so let's just go ahead and get right 0:01:00 into it. 0:01:03 When all your marketing team does is put out fires, 0:01:04 they burn out. 0:01:07 But with HubSpot, they can achieve their best results 0:01:08 without the stress. 0:01:11 Tap into HubSpot's collection of AI tools, 0:01:14 Breeze, to pinpoint leads, capture attention, 0:01:17 and access all your data in one place. 0:01:19 Keep your marketers cool and your campaign results 0:01:20 hotter than ever. 0:01:24 Visit hubspot.com/marketers to learn more. 0:01:29 [MUSIC PLAYING] 0:01:31 So our first question is from nothead.ai. 0:01:35 So he says, where do you think we will see the next best 0:01:37 leap in AI agents? 0:01:39 Yeah, because we've heard about agents for so long, 0:01:41 but like nothing's actually worked yet. 0:01:43 Like there was all the hype, you know, with BabyAGI 0:01:45 and all those that came out and AutoGPT, 0:01:47 what was it, like almost like a year and a half ago. 0:01:48 And then really nothing happened. 0:01:50 So I think a lot of people were really disappointed 0:01:53 that there was all that hype, but the rumor 0:01:55 is that OpenAI has been telling their investors 0:01:59 that the new o1 model, like not the preview that's out 0:02:01 right now, but the actual o1, that they're 0:02:03 having some like pretty good results with agents. 0:02:06 And so I think currently that's what I'm betting on 0:02:09 is like, you know, if they're telling their investors that, 0:02:11 like usually you don't lie to investors. 0:02:13 So if they're actually telling investors that, 0:02:15 that probably means it's at least working somewhat. 0:02:16 So I think that's going to be the next step. 0:02:17 It's probably not going to be amazing. 0:02:19 It's going to be like a lot of these things 0:02:21 where sometimes they get overhyped, 0:02:23 but at least if they're useful in some use cases, 0:02:25 then you just kind of, it'll get better from there. 0:02:27 – Well, I know like the Rabbit R1, right? 0:02:29 Which was just like horribly reviewed 0:02:31 by everybody who got their hands on it. 0:02:35 Well, they just now started to roll out the large action model. 0:02:39 And supposedly it's pretty decent now. 0:02:42 Like it can actually watch things on your screen. 0:02:44 So you train it once on how to do something.
0:02:46 Like you can train it on how to go buy something for me 0:02:47 on Amazon. 0:02:50 Once you've done it once, it sort of learns how to do that. 0:02:52 And then next time you can say, 0:02:57 hey, go buy me a new water bottle on Amazon or whatever. 0:03:01 And it will go and actually go through all the steps. 0:03:02 I actually have a Rabbit. 0:03:03 I haven't tried that yet, 0:03:06 but I hear they're actually making some good strides there 0:03:09 on the Rabbit, but it's so hard to say. 0:03:11 Like I feel like everybody's vision 0:03:14 of what an AI agent is going to be 0:03:16 is like slightly different. 0:03:17 Like we've already got Perplexity, 0:03:19 which is kind of like an AI agent 0:03:22 where it will search out one query, based on what it finds, 0:03:23 search out another query, 0:03:26 and possibly even a third query, 0:03:28 and then present you with all the information 0:03:29 that it came up with, right? 0:03:30 We've already got that. 0:03:33 You can already create like semi AI agents 0:03:36 with things like make.com and Zapier. 0:03:41 And there's a tool called MindStudio, and agent.ai, 0:03:45 right, is the one from Dharmesh over at HubSpot. 0:03:47 You've got all of these tools 0:03:50 that are kind of agentic already, right? 0:03:53 You're basically having the AI go and use this tool 0:03:56 via an API and then you can connect all of these APIs 0:03:59 and all of these AIs together 0:04:01 and get what you're looking for. 0:04:03 But they're still kind of like convoluted 0:04:06 and complex and sort of tough to build. 0:04:10 But I feel like AI agents is sort of like a tough thing 0:04:13 to really predict where it's going to end up 0:04:15 because everybody kind of needs them 0:04:17 to do different things for them, right? 0:04:19 So I don't know if there's going to be, 0:04:21 at least in the very, very near term, 0:04:23 like a one-size-fits-all AI agent 0:04:25 where people are like, this is it. 0:04:28 We got the AI agent that everybody was looking for. 0:04:29 I think we're gonna have like a whole bunch 0:04:33 of little sparks of agents all over the place 0:04:35 that all do little different things. 0:04:37 And we're probably still a little ways off 0:04:39 before it's like a personal assistant 0:04:42 where you can just tell it to do anything you want, you know? 0:04:43 – Yeah, I mean, I think you're gonna see lots 0:04:44 of different ones where it's just 0:04:46 for like a specific use case. 0:04:47 Like you just said, I saw a tweet 0:04:49 from Dan Shipper the other day 0:04:51 where he's got something they're gonna be releasing 0:04:52 where it looks like it's an agent 0:04:53 that like goes through all your emails 0:04:55 and like responds to the ones 0:04:57 that obviously just need very simple responses, 0:04:58 and you kind of program it: 0:05:02 what kind of simple responses are okay for it to give? 0:05:03 And then also like, you know, 0:05:04 probably checks the ones that are spam 0:05:06 and the ones that you don't need to respond to, 0:05:07 and the ones that are important. 0:05:08 And it summarizes all of it for you 0:05:09 and helps take care of them. 0:05:11 So I can see that we're gonna have all these different agents. 0:05:12 – Does that exist already? 0:05:14 Is that something that Dan's building? 0:05:15 – Well, he showed a screenshot of it. 0:05:17 Like basically like that it's working internally 0:05:19 and he like kind of said, like something's coming soon.
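The email agent Nathan describes here, and the make.com recipe Matt walks through next, both reduce to the same loop: watch a mailbox, classify the message, and pick a canned reply. Here is a rough, code-only sketch of that pattern using Python's standard imaplib; the label name, keyword rules, and credentials are illustrative placeholders, and a real build would add the LLM call and the actual send step:

```python
# Minimal sketch: poll a Gmail label over IMAP and choose a stock reply per message.
# Assumes IMAP is enabled and an app password is used; label and keywords are made up.
import email
import imaplib

STOCK_REPLIES = {
    "sponsorship": "Thanks for reaching out! Please send over rates and audience details.",
    "guest": "Thanks! Podcast bookings go through our producer, so I'll forward this along.",
}
DEFAULT_REPLY = "Thanks for the email, I'll get back to you soon."

def pick_reply(text: str) -> str:
    """Pick a canned response by keyword; an LLM call could replace this lookup."""
    text = text.lower()
    for keyword, reply in STOCK_REPLIES.items():
        if keyword in text:
            return reply
    return DEFAULT_REPLY

def check_label(user: str, app_password: str, label: str = "Auto-Reply") -> None:
    mail = imaplib.IMAP4_SSL("imap.gmail.com")
    mail.login(user, app_password)
    mail.select(label)                      # the Gmail label plays the "folder" trigger role
    _, data = mail.search(None, "UNSEEN")   # only messages that haven't been handled yet
    for num in data[0].split():
        _, msg_data = mail.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        print(f"Would reply to {msg['From']} with: {pick_reply(msg.get('Subject', ''))}")
    mail.logout()
```

make.com and Zapier wire up this same trigger-and-act flow without code, which is why they are the faster path for most people.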
0:05:21 So, and you know, I can imagine that 0:05:24 with something like agent.ai, what Dharmesh is doing. 0:05:25 That makes a lot of sense. 0:05:26 I hope they have some kind of directory 0:05:28 or something like this where it's like, 0:05:29 here's an agent for your email. 0:05:32 Here's an agent for like sales outreach. 0:05:33 Here's an agent for whatever. 0:05:36 But it's probably, it probably is several years away 0:05:38 before we have like an all-in-one that does all of it. 0:05:40 Like maybe that's like five years away. 0:05:41 – Yeah, yeah. 0:05:43 I mean, I've already started building like little things 0:05:45 like the email one. 0:05:47 Like you can use a tool like make.com 0:05:52 and you can use folders or tags inside of Gmail 0:05:55 and basically have make.com watch for any time 0:05:57 an email goes into a specific folder. 0:06:00 And if you put an email in a specific folder, 0:06:03 then make.com follows through on a set of actions for you, 0:06:04 right? 0:06:06 So let's say like, for example, you get an email 0:06:09 that you're like, okay, this one is something 0:06:10 I don't need to personally respond to, 0:06:13 it's something that I respond to often. 0:06:14 So I have sort of like a stock response 0:06:17 that I send every time. I can throw it into a folder 0:06:20 that's like, use one of my stock responses, right? 0:06:23 And then make.com could look in that folder 0:06:26 whenever a new email comes in, read the email, 0:06:29 and then sort of decide how to reply based on like a handful 0:06:31 of potential set responses. 0:06:34 And it could save like a whole bunch of time with email, right? 0:06:37 So there's already like stuff like that that you can do, 0:06:41 but I just feel like the definition of an AI agent, 0:06:44 or what the expectation of an AI agent is, 0:06:46 it's sort of a moving goalpost, right? 0:06:50 It's like AGI, like nobody sort of agrees on what AGI is. 0:06:53 Well, an agent, like the agent that I want 0:06:55 to help me in my business is probably a little bit different 0:06:57 than the agent you want to help you in your business. 0:06:59 It's probably a little bit different than the agent 0:07:03 that an airline pilot wants for their career, right? 0:07:06 Like everybody wants slightly different things. 0:07:08 And I think it's the nuance 0:07:10 that makes it sort of complicated right now. 0:07:13 – Yeah, I think OpenAI said that level three, 0:07:15 they're trying to have like different levels of AGI, 0:07:18 'cause yeah, like defining AGI is actually quite hard. 0:07:20 Like everyone has different ideas of what it means. 0:07:22 I think they put level three as agents 0:07:24 and level two is reasoning, and that makes sense, 0:07:26 'cause like that's the reason the agents didn't work 0:07:28 is 'cause agents require some level of reasoning. 0:07:30 You know, you give them some kind of tasks, 0:07:32 they go off and get confused, like maybe BabyAGI 0:07:34 and the other ones got confused. 0:07:35 And so they have to have some kind of reasoning 0:07:36 to actually get past that confusion 0:07:38 and figure out what to do next. 0:07:41 So if they have nailed reasoning with o1, if they really have, 0:07:42 then I think we will start to see 0:07:44 some really cool useful agents. 0:07:46 Ones that actually go off and do stuff for you. 0:07:46 – For sure. 0:07:50 So the next one is from the Jacob Gooden. 0:07:51 He's actually a buddy of mine. 0:07:53 He used to be the producer of my old show, 0:07:54 Hustle & Flowchart.
0:07:55 Still is the producer of Hustle & Flowchart. 0:07:57 I'm not on that show anymore. 0:07:58 But he asked a good question here. 0:08:00 What is your go-to AI tool? 0:08:03 The thing you use every day and would miss 0:08:04 if it went away tomorrow? 0:08:06 – Yeah, so mine has become, 0:08:07 I know yours is probably Perplexity. 0:08:09 Is that, is that? 0:08:10 – Mine would be two. 0:08:11 Like I think there's two, right? 0:08:12 Perplexity and Claude. 0:08:16 Those are the two that I use like every single day. 0:08:20 – Yeah, so mine has become the ChatGPT voice 0:08:21 or whatever they're calling it. 0:08:22 I'm just gonna call it ChatGPT voice. 0:08:24 I know they've got different names for all their stuff. 0:08:27 But I'm still finding a ton of value in that. 0:08:29 Like talking to it every single day. 0:08:30 I use it for personal stuff. 0:08:32 Like what am I doing today? 0:08:33 What's my schedule? 0:08:36 Or just things I'm thinking through to take notes. 0:08:37 I use it like that in the morning. 0:08:41 I've been using it to help translate with my wife. 0:08:43 And also my son has a ton of fun using it. 0:08:47 It's been like a really cool way to explain AI to him 0:08:49 and show him what's possible. 0:08:50 And also Perplexity. 0:08:52 I've been using Perplexity more and more. 0:08:53 Like as I said, 0:08:56 I set it to my default search engine in my browser. 0:08:58 And so when I type in something in the browser now, 0:09:00 it just pops up in Perplexity. 0:09:02 And 90% of the time I'm finding that that's better 0:09:05 than the Google results, honestly. 0:09:08 And then also they just keep releasing, 0:09:09 they're releasing new things so fast. 0:09:12 They're like, people will suggest things on Twitter. 0:09:14 And then Aravind, the CEO, will see it and respond to it 0:09:16 and say, cool idea or whatever. 0:09:17 And then like a week later, 0:09:19 you'll see him like post a screenshot 0:09:22 or a link to the thing, right? 0:09:24 – Yeah, I mean, going back to the original question, 0:09:27 I think, yeah, for me, it's Perplexity and Claude. 0:09:29 I mean, Perplexity for all the reasons you just mentioned. 0:09:31 I love it for research, right? 0:09:32 I love going in there and saying, 0:09:35 hey, I'm trying to deep dive on this topic. 0:09:36 What can you find for me? 0:09:37 And then it's really good 0:09:39 at suggesting follow-up questions too, right? 0:09:42 Like it even kind of takes the thinking out of, 0:09:44 well, what should I ask next to learn more about this? 0:09:46 'Cause it gives you like five potential questions 0:09:47 to ask next. 0:09:49 So I really like just sort of going down 0:09:51 a Perplexity rabbit hole. 0:09:54 And I like, I use Perplexity all the time, 0:09:56 like constantly, to sort of research topics. 0:09:59 And then Claude really, really helps me dial in my videos, 0:10:02 right? Especially the short form videos. 0:10:05 Notebook LM has been really, really cool. 0:10:06 I love plugging in stuff into that 0:10:10 and like listening to a podcast back about a topic. 0:10:13 But yeah, if I had to, if there was one tool 0:10:15 that was like, if this was gone tomorrow, 0:10:16 I'd be really, really bummed out, 0:10:19 it'd probably be Perplexity in first place. 0:10:21 Claude would be second place. 0:10:24 – Yeah, I finally tried Notebook LM 0:10:25 with my son over the weekend too. 0:10:28 So we tried Replit and then we tried Notebook LM.
0:10:30 Same thing where we were using ChatGPT voice, 0:10:33 like chatting with it, getting ideas. 0:10:36 And then just typing it right into Notebook LM. 0:10:37 – There was a video going around, 0:10:38 I don't know if it was on Twitter or something, 0:10:40 but there was a video going around 0:10:41 where somebody made a text file 0:10:45 and they just put the word poop into it like 2,000 times. 0:10:48 And then they uploaded it into Notebook LM. 0:10:50 And like, they made a whole 10 minute podcast 0:10:53 about the document that just had the word poop 0:10:56 posted into it like 10,000 times, right? 0:10:57 And they're just like, you know, 0:10:59 we've talked about a lot of things on this podcast, 0:11:01 a lot of really interesting things, 0:11:03 a lot of non-interesting things. 0:11:05 And today I think we've got the most interesting document 0:11:06 we've ever seen. 0:11:08 This document is just the word poop 0:11:10 over and over and over again. 0:11:11 – You know, it sounds ridiculous, 0:11:13 but like me and my son had so much fun with it, 0:11:14 like so much fun. 0:11:16 It's like, God, that's like, you know, 0:11:18 we've talked about like generative entertainment 0:11:20 and stuff in the past and past episodes. 0:11:23 It's like, but that's like one of the first examples 0:11:25 of like generative entertainment, right? 0:11:27 Or it's like, yeah, in the future, 0:11:28 you're gonna have more and more stuff like this 0:11:31 where you just like generate the stuff that you enjoy, 0:11:32 you know, 'cause everyone's different. 0:11:34 And like there's stuff that I find funny, 0:11:36 those people are like, God, that's stupid. 0:11:40 – Yeah, all right, so moving over to Twitter slash X now, 0:11:43 Railia says, have you found any tools 0:11:44 that are great for coming up 0:11:47 with YouTube video ideas slash titles? 0:11:49 I know a couple of tools, vidIQ, 0:11:51 Spotter Studio, Creator ML, 0:11:53 would be curious to know what else you have played with 0:11:54 and what is working well. 0:11:57 So I feel like this question's probably directed at me, 0:12:00 but I mostly still use Claude for a lot of this stuff. 0:12:01 I do have a vidIQ account. 0:12:02 I do have a Spotter account. 0:12:04 I do have a TubeBuddy account. 0:12:06 I have like all of those tools, 0:12:09 but most of those are like, 0:12:11 I use Spotter to help me come up 0:12:14 with thumbnail ideas mostly, right? 0:12:19 I use TubeBuddy and vidIQ more for like data analysis 0:12:22 and to watch the stats and to see it 0:12:24 and to like test thumbnails and things like that. 0:12:27 I still use Claude to help me come up with thumbnails 0:12:30 or with titles and like hooks and stuff like that, right? 0:12:31 Like I'll go to Claude and say, 0:12:34 hey, I want to make a video about X, Y and Z, 0:12:37 help me come up with a good title for this video. 0:12:39 Or what I do a lot of times now 0:12:40 is I'll record the whole video, 0:12:42 get the transcript from the video, throw it into Claude 0:12:45 and say, here's a transcript from a video I recently made, 0:12:47 help me come up with a title for it, right? 0:12:52 So I still kind of use the bare bones tools 0:12:54 that aren't actually designed for YouTube 0:12:57 because a lot of these tools that were designed for YouTube 0:13:01 are just using things like OpenAI or Claude or Llama 0:13:03 or one of these models underneath anyway. 0:13:06 So I'm just sort of like skipping the middle man, honestly.
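For anyone who wants to script the transcript-to-title step Matt just described rather than pasting into the Claude web app each time, here is a small sketch against Anthropic's Python SDK. The model id, file name, and prompt are placeholders, not a recommendation:

```python
# Rough sketch: send a finished video's transcript to Claude and ask for title options.
import anthropic

client = anthropic.Anthropic()  # assumes ANTHROPIC_API_KEY is set in the environment

def suggest_titles(transcript: str, count: int = 5) -> str:
    message = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder; use whatever Claude model is current
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": (
                "Here's a transcript from a video I recently made:\n\n"
                f"{transcript}\n\n"
                f"Help me come up with {count} strong YouTube title options for it."
            ),
        }],
    )
    return message.content[0].text

with open("video_transcript.txt") as f:  # hypothetical transcript export
    print(suggest_titles(f.read()))
```

The same pattern covers the hook and thumbnail-text brainstorming he mentions; only the prompt changes.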
0:13:12 But first, I want to tell you about another great podcast 0:13:13 you’re going to want to listen to. 0:13:15 It’s called Science of Scaling, 0:13:17 hosted by Mark Roberge. 0:13:19 And it’s brought to you by the HubSpot Podcast Network, 0:13:23 the audio destination for business professionals. 0:13:25 Each week, host Mark Roberge, 0:13:27 founding chief revenue officer at HubSpot, 0:13:29 senior lecturer at Harvard Business School 0:13:32 and co-founder of Stage 2 Capital, 0:13:35 sits down with the most successful sales leaders in tech 0:13:37 to learn the secrets, strategies and tactics 0:13:40 to scaling your company’s growth. 0:13:42 He recently did a great episode called, 0:13:45 how do you solve for a siloed marketing in sales? 0:13:47 And I personally learned a lot from it. 0:13:49 You’re going to want to check out the podcast, 0:13:50 listen to Science of Scaling, 0:13:53 wherever you get your podcasts. 0:13:57 But like, may not cue you, 0:13:59 what do you use it mainly for like stats on videos 0:14:00 and things like that? 0:14:01 – So may not cue you, I mostly use, 0:14:03 ’cause they’ve got like a really nice like sidebar 0:14:05 that automatically shows up on YouTube 0:14:07 and it’ll show you if like thumbnails have been recently 0:14:09 changed or titles have been recently changed. 0:14:13 So it’s more to like analyze other videos on other channels 0:14:16 than it is for like my own channel, honestly. 0:14:17 I can go look at other videos 0:14:19 and see how well they’re performing 0:14:21 compared to like their normal videos. 0:14:22 It also puts like a little, 0:14:24 like a multiplier below the video. 0:14:27 So it’ll say this video is performing at like 1.5x, 0:14:29 the normal video on this channel. 0:14:31 This video is performing at 50x the normal video. 0:14:34 This video is performing at 0.5x, 0:14:36 like it’s underperforming for this channel. 0:14:39 So I use it a lot like that to see what types of videos 0:14:40 and titles and thumbnail combinations 0:14:42 are working for other people. 0:14:45 Just to kind of get ideas and stay looped in. 0:14:47 I don’t really copy other people’s ideas, 0:14:50 but like it’s more keeping my finger on the pulse. 0:14:54 Saying that TubeBuddy is an AI first company. 0:14:55 Like they’re actually, 0:14:57 they actually got purchased by a company called Ben Labs 0:15:01 and Ben Labs was a giant like AI research company. 0:15:05 So like TubeBuddy is sort of like built around AI these days. 0:15:09 But yeah, I don’t really know the history 0:15:11 too much of vidIQ or anything like that. 0:15:13 And Spotter actually is really, really good 0:15:15 at generating thumbnail concepts. 0:15:18 You can give it an idea for a video and it’ll use, 0:15:20 I don’t know what AI models are using under the hood, 0:15:22 but it will generate like thumbnails 0:15:26 based on what your channel thumbnails normally look like. 0:15:29 So I will go and I will give it an idea for a video. 0:15:31 It’ll make a thumbnail that looks similar 0:15:33 to like my most popular thumbnails, 0:15:35 but with like new elements in it. 0:15:37 So it actually is sort of trained in 0:15:39 on my existing thumbnails on my channel. 0:15:43 Like it even tries to make like a sort of face 0:15:45 that looks like mine, not really good, 0:15:48 but it’ll make a like a bearded man 0:15:50 that looks somewhat close to me in the thumbnail 0:15:53 just to give you the concept, you know? 0:15:54 – Right. 0:15:54 That’s cool. 0:15:56 Are you actually using that, the Spotter? 
0:15:58 Or is that just giving you ideas? 0:15:58 That’s so cool. 0:16:00 – Yeah, well, I don’t actually take 0:16:02 the thumbnail straight from Spotter. 0:16:04 In fact, the thumbnail that it generates for you, 0:16:07 it actually has text on it that says like generated with AI. 0:16:10 Right? So like, I can’t just take the thumbnail straight 0:16:12 from Spotter and upload it to YouTube, 0:16:14 but it’ll give you a concept. 0:16:18 And I can take that concept and use it as like an image 0:16:22 to image inside of stable diffusion 0:16:24 and have stable diffusion generate something 0:16:25 that looks similar. 0:16:29 Or what I’ve done, if I really, really like the thumbnail 0:16:32 it generated for me is I take the thumbnail it generated, 0:16:35 pull it into Photoshop, put a little square around the area 0:16:37 where it says generated with AI. 0:16:40 And then use generative fill to just remove it. 0:16:42 – So we just discovered a business idea 0:16:43 for anyone listening. 0:16:45 (laughing) 0:16:47 You literally could just go copy Spotter 0:16:51 and not say made with AI and have a business right there. 0:16:51 So. 0:16:53 – Yeah, maybe. 0:16:56 They have some like proprietary stuff behind the scenes 0:16:59 to like actually learn on your channel 0:16:59 and stuff like that. 0:17:02 – Yeah, not that simple, but if you, yeah, 0:17:03 someone’s smart enough. 0:17:04 – Yeah, yeah, yeah, for sure. 0:17:09 Let’s see, coal mine canary says you should address 0:17:12 the topic of there being too many AI tools 0:17:14 and people’s inability to afford them. 0:17:16 It’s going to be a problem. 0:17:20 Now that specific question kind of concept 0:17:23 came up a lot on my, this like Twitter thread. 0:17:26 So when I asked you to ask these questions, 0:17:28 the thought of there’s way too many tools, 0:17:31 they all want like a monthly fee to use them. 0:17:33 You know, how does this all play out? 0:17:36 Like what does this all look like in the future? 0:17:40 That sort of concept came up over and over and over again. 0:17:41 And I personally think it’s a problem, right? 0:17:44 Like as the guy who runs future tools 0:17:46 where people are like submitting tools to me every day 0:17:49 and I’m seeing like a hundred new tools every day, 0:17:52 I only approve like 1% of the tools these days 0:17:53 that get submitted to me. 0:17:56 Because so many of them do the exact same thing 0:17:58 and so many of them just feel like 0:18:00 cheap low effort money grabs, right? 0:18:05 People will go and go, oh, I can use the flux, 0:18:07 you know, flux pro API. 0:18:09 Cool, I’m going to go make an AI image generator 0:18:11 and charge 20 bucks a month for it, right? 0:18:13 And all they’re doing is putting a wrapper 0:18:14 around the flux API. 0:18:16 And I just see so much of that 0:18:19 and it’s just low quality and it’s junk, right? 0:18:21 So here’s what are your thoughts 0:18:24 about the like oversaturation issue? 0:18:25 – Yeah, I guess I have a lot of thoughts. 0:18:26 I mean, it feels like right now 0:18:27 we’re in like this exploratory phase 0:18:29 where it’s good that that’s happening 0:18:31 ’cause people are out there trying all these things 0:18:33 so we get to explore all the possibilities 0:18:36 before we set it, you know, before things stabilize 0:18:37 and it’s like, oh, here are like the five things 0:18:39 that everyone uses, right? 0:18:40 ‘Cause over time, that probably will happen. 
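Going back a moment to the Spotter-to-Stable-Diffusion step Matt described: using a generated thumbnail concept as the starting image for an image-to-image pass looks roughly like this with the diffusers library. The checkpoint, file names, and strength value are illustrative, and you would want a GPU for it to be practical:

```python
# Rough sketch: use a generated thumbnail concept as the init image for an img2img pass.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint; any img2img-capable model works
    torch_dtype=torch.float16,
).to("cuda")

# 16:9-ish size with both dimensions divisible by 8, as the pipeline expects
concept = Image.open("spotter_concept.png").convert("RGB").resize((768, 432))

result = pipe(
    prompt="YouTube thumbnail, bearded man reacting to a glowing AI robot, bold colors",
    image=concept,
    strength=0.55,        # lower stays closer to the concept, higher reinterprets more
    guidance_scale=7.5,
).images[0]

result.save("thumbnail_draft.png")
```

For the final text overlay and cleanup, Matt still takes the result back into Photoshop, as he describes above.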
0:18:42 Like, you know, people say they hate monopolies 0:18:44 but that is one of the benefits of a monopoly. 0:18:46 You’ll have less, you’ll have more things 0:18:48 baked into one product over time 0:18:50 and then you’ll pay like one fee for that. 0:18:51 And I think we will have that. 0:18:53 Like I think in like three years from now 0:18:55 you’ll probably have like five things that people pay for 0:18:57 and most people will probably pay for one or two things, 0:18:58 would be my guess. 0:18:59 – Or you don’t pay for any of them 0:19:01 and they’re ad supported, right? 0:19:04 – Yeah, but, you know, there’s something else though 0:19:06 that I’ve been thinking about recently 0:19:08 ’cause like there’s been like, you know, whispers 0:19:11 that like the new, the models from opening AI 0:19:14 in the future, the ones that are gonna be supposedly 0:19:16 amazing and like people are gonna think it’s AGI. 0:19:20 Why do people assume that that’s gonna be $25 a month 0:19:20 or whatever? 0:19:22 And they may not be. 0:19:25 Like some of these future models may be very expensive. 0:19:27 Like we may start to have a divide there where it’s like, 0:19:29 yeah, there’s like the $25 a model 0:19:31 and you’re getting what you get now 0:19:34 or you pay like a thousand or 10,000 a month 0:19:36 and you get access to AGI. 0:19:39 And ’cause it may cost a lot to run these future models 0:19:40 and they may be absolutely amazing 0:19:42 and very expensive to run. 0:19:45 So that’s kind of, that’s where I’m actually kind of concerned. 0:19:47 – If you pay 10,000 a month for AGI 0:19:50 can I go tell my AGI to go make me a business 0:19:52 that makes more than 10,000 a month? 0:19:54 – Well, yeah, that’s the thing is like, yeah, 0:19:56 this may literally, you know, ’cause we’ve, 0:19:58 I think we’ve talked a little bit in the past about like, 0:20:01 AI could help, you know, 0:20:03 ’cause right now there’s huge wealth disparity 0:20:04 all around the world, right? 0:20:07 And in theory, AI could help with that 0:20:10 by giving everyone more opportunities and things like that. 0:20:12 But also we could, in a particular situation 0:20:14 where only some people can afford the AI 0:20:16 and everybody who can afford it and use it well, 0:20:19 they get way ahead of everyone else, right? 0:20:22 ‘Cause like, yeah, if you can pay $10,000 a month 0:20:24 to use the newest open AI model 0:20:27 and it’s basically like hiring 20 employees or something, 0:20:28 nothing that’s gonna happen now, 0:20:30 but like, let’s say in three years or something, 0:20:31 that could really change your life. 0:20:34 Like, and people who can’t afford that, 0:20:35 it’s not gonna be great for them. 0:20:37 So I do wonder about that. 0:20:40 Like I think people are not thinking about that yet about, 0:20:42 yeah, these things cost a lot right now, 0:20:46 but they may cost a lot more in the future. 0:20:48 – I can see it going either direction honestly, 0:20:51 because I do know like they’re going to continually work 0:20:54 to get the cost of compute down as well, right? 0:20:57 I think as it gets more and more intelligent, 0:20:59 it’s going to require more compute power, 0:21:01 but at the same time, 0:21:03 everybody’s trying to lower the cost of compute, right? 0:21:06 Like NVIDIA is trying to make their chips 0:21:09 more affordable to build and more efficient 0:21:10 and all that sort of stuff. 0:21:12 – And Microsoft and others also, 0:21:14 I think they’re starting to invest into nuclear as well. 
0:21:17 So like, we’re gonna need more energy as well 0:21:18 and like cleaner energy. 0:21:20 So I think that’s gonna be one of the good things, 0:21:21 is like, yeah, we’re gonna have probably cleaner energy 0:21:23 all around and more energy. 0:21:25 – So that’s a good thing. 0:21:26 Yeah, I’m not trying to make it negative, 0:21:28 but that is kind of like what’s been 0:21:29 the back of my mind recently. 0:21:33 Like, oh yeah, maybe these really will require like, 0:21:35 you may have to pay like a thousand to $10,000 a month 0:21:37 for like the best models. 0:21:39 – Yeah, I don’t know. 0:21:41 I mean, personally, I think AGI is probably 0:21:43 a little bit farther out than like three years, 0:21:46 but I don’t know, maybe I’m being pessimistic. 0:21:51 I think with like the tool like sort of overload concept, 0:21:53 the saturation of just too many tools, 0:21:54 too many monthly payments. 0:21:56 I do think that’s an issue right now, 0:22:00 but I also do think there will be like a consolidation, 0:22:01 like you mentioned. 0:22:03 I think a lot of people are gonna be using the Googles, 0:22:06 the Microsofts, the, you know, maybe OpenAI, 0:22:09 maybe Anthropic, maybe some of these other companies 0:22:11 all just have like a single platform 0:22:13 where you can generate your images, 0:22:16 generate your videos, generate your text, 0:22:18 you know, have this sort of agentic features 0:22:21 and you pay the one service and it could kind of do it all. 0:22:24 I think that’s eventually where it’s going to get 0:22:27 in the near term, but right now I do think it’s a problem. 0:22:30 I think, I also think there’s a lot of people out there 0:22:32 that are just absolutely delusional 0:22:35 with like their product ideas. 0:22:37 I think so many people are going out there and going, 0:22:41 hey, look, I just created an AI tool 0:22:43 that can write children’s books for you. 0:22:45 It’s 50 bucks a month, right? 0:22:47 But then like somebody will go use it once 0:22:51 and be like, cool, I made an okay children’s book 0:22:54 that’s nothing special. 0:22:56 Why am I gonna keep paying monthly for that, right? 0:22:59 Like I see all the time on the Future Tools website, 0:23:02 people send to me like AI tattoo generator 0:23:03 for 10 bucks a month. 0:23:05 Who wants to pay 10 bucks a month 0:23:06 for an AI tattoo generator? 0:23:08 At the very least, I’m gonna pay once, 0:23:11 get my tattoo idea generated, go get my tattoo 0:23:12 and then I don’t need you anymore. 0:23:14 Like I think there’s so many people 0:23:15 that are just delusional with the products 0:23:16 they put out there thinking 0:23:19 that there’s an actual business model behind them 0:23:20 but really it’s just like a, 0:23:22 this is a cool feature that I’m trying to make money off 0:23:25 of quick while AI is hot and in the news 0:23:27 but it’s not a good product. 0:23:29 – Yeah, I mean, in the non-AI world right now, 0:23:30 like the trend is to have stuff 0:23:32 where you like pay for software once 0:23:34 or that’s like kind of a movement happening right now. 0:23:36 You pay for it once and then you own it forever. 0:23:37 – Yeah. 0:23:38 – But with AI, you can’t do that. 0:23:38 So that is the problem, right? 0:23:41 Like these things do cost money to run the models. 0:23:43 It’s not cheap. 0:23:45 And so you kind of have to charge like that. 0:23:47 So there is an issue where a lot of these products 0:23:51 where they’re charging, the value is not there yet. 
0:23:53 And I think it’s coming sooner than it sounds like you do 0:23:56 but as of right now, a lot of these products, yeah, 0:23:57 you pay like 25 to 50 bucks a month 0:24:00 and most of them are not worth it really. 0:24:02 – Well, I also think there’s probably a future 0:24:04 and I don’t know how far off I am on this future. 0:24:06 Like I don’t know if it’s within three years, 0:24:07 within 10 years, within 20 years, 0:24:09 but I do think there’s a future 0:24:12 where like most people have like a super computer 0:24:14 in their home, right? 0:24:16 So like an on-device AI 0:24:18 that’s doing all of this stuff for them, 0:24:20 but it’s in their house. 0:24:22 So they don’t actually have to send it off to a data center 0:24:24 or cloud GPUs or whatever, right? 0:24:27 Like I think a lot of, I think that will happen as well. 0:24:32 I think there’s like a big future in on-device inference 0:24:35 for running the AI’s locally, right? 0:24:38 Like if we’re gonna have our own Jarvis, right? 0:24:40 That’s cleaning our house and doing our dishes 0:24:43 and doing our laundry and vacuuming our floors 0:24:48 and, you know, does everything for us on a daily basis. 0:24:53 I have a really hard time seeing that all be in the cloud, 0:24:53 right? 0:24:57 What happens if that cloud service goes down for the day? 0:24:59 All right, everything I do with my life, 0:25:02 I can’t do today because that service is down. 0:25:05 What happens if like there’s internet outages? 0:25:09 Okay, now I can’t actually run all of the stuff 0:25:11 that I’ve been running in my life 0:25:14 because I can’t contact Google services, you know, 0:25:15 servers today, right? 0:25:18 So I do think that there’s probably also a future 0:25:20 where it might only be for the wealthy. 0:25:22 I don’t know, but I do think people are gonna have 0:25:24 like super computers in their home 0:25:28 that can like run these massive AI systems at some point. 0:25:29 – Yeah, I was thinking about that. 0:25:31 You’re talking about Jarvis, but like, yeah, Tony Stark 0:25:32 is supposed to be a billionaire, right? 0:25:33 In the stories. 0:25:35 – Yeah, exactly. 0:25:37 – So maybe, you know, it is a billionaires 0:25:38 who have the local models. 0:25:41 ‘Cause I mean, in the future that could happen, 0:25:43 but it feels like for a long time, 0:25:44 you’re gonna need a lot of compute 0:25:45 to make these models really useful. 0:25:47 And so obviously the ones in the cloud 0:25:48 where they’re benefiting from the large scale 0:25:51 of all the servers being one place and remotely, 0:25:52 that’s gonna always be way better 0:25:53 for at least for a while. 0:25:55 – Yeah, well, I mean, I think the training 0:25:56 can happen remotely, 0:25:59 but the inference would happen locally, right? 0:26:02 So, you know, when they actually like train and open, 0:26:04 I know you know this, I’m just more saying this 0:26:05 for like the audience, right? 0:26:07 But like when they train these big models, 0:26:10 like the newest version of like Claude, 0:26:13 the newest version of chat, GPT, things like that, 0:26:16 a lot of times it takes like millions of dollars 0:26:18 and months and months and months and months 0:26:20 to train a new model, right? 0:26:23 I don’t think we’re very close to doing that at home, 0:26:24 but the inference part, 0:26:26 the part where we ask it the question, 0:26:30 it sort of queries the database and then responds to you, 0:26:32 that’s a lot less compute intense. 
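The on-device inference Matt is speculating about is already easy to poke at on an ordinary machine: tools like Ollama run open-weight models locally and expose a simple HTTP endpoint. Here is a minimal sketch, assuming Ollama is installed, `ollama serve` is running, and a model has been pulled; the model name and prompt are placeholders:

```python
# Rough sketch: query a locally running model through Ollama's HTTP API.
# Assumes a model is available locally (e.g. via `ollama pull llama3`).
import requests

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    response = requests.post(
        "http://localhost:11434/api/generate",   # Ollama's default local endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

# Nothing leaves the machine, which is the whole appeal if the cloud service is down.
print(ask_local_model("Give me three dinner ideas that use leftover rice."))
```

Today's local models are a long way from the Jarvis scenario they go on to discuss, but the plumbing for keeping inference in the house already exists.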
0:26:33 So I think, 0:26:35 – Well, I’m not sure. 0:26:38 So like based on what OpenAI is saying about the 01 model, 0:26:40 it seems like it’s gonna be more and more intensive 0:26:41 on the inference side. 0:26:43 So if that is how they start scaling things up, 0:26:45 and if let’s say the anthropic, 0:26:47 I assume maybe anthropics going the same route, 0:26:49 maybe they’re like a few months behind OpenAI or whatever, 0:26:52 we’re probably gonna see all these guys like go into the, 0:26:53 you know, we’re basically, 0:26:54 inference is gonna be more and more important. 0:26:56 Like, and so I don’t know, I’m not sure. 0:26:59 Like I think inference is gonna be one of the ways 0:26:59 that this all scales, 0:27:02 ’cause like the logic side is what’s been missing. 0:27:04 And so it seems like that’s where they’re all gonna be 0:27:05 focusing on. 0:27:06 – Yeah, yeah. 0:27:09 But I mean, I still think that the inference side, 0:27:14 it’s going to, like the stuff that OpenAI is talking about 0:27:16 is more for like the really sort of complex 0:27:19 mathematical stuff in the current state. 0:27:22 I don’t really think, 0:27:23 I don’t know how to word this, 0:27:25 but I don’t really think that like, 0:27:28 making it take a lot longer on the inference side 0:27:29 is what people are gonna want. 0:27:31 So I don’t, like I think they’re gonna have to figure out 0:27:34 how to like really, really shrink that time down. 0:27:35 So when you give it a task, 0:27:38 you get the response quickly, right? 0:27:40 I think slowing it down is cool 0:27:43 if I need to like analyze a complex math problem 0:27:46 or write some code for me or something like that. 0:27:50 But if I need to just like get a response really quickly 0:27:53 on the best ice cream shop near me, 0:27:55 I don’t want it to like analyze for 15 minutes 0:27:56 before it gives me a response. 0:27:58 – Yeah, yeah, but that’s where you would scale up compute. 0:28:01 So if you scale up compute, you could do that faster. 0:28:03 Like, so I think those systems, like they’ll, 0:28:05 they’ll want that to be fast. 0:28:07 They know that that’s not usable for regular people, 0:28:10 like waiting, you know, 20 seconds or a minute or whatever, 0:28:11 like people are gonna expect 0:28:13 these things are happening almost instantly. 0:28:14 – Yeah, yeah, yeah. 0:28:15 We’re getting very theoretical here. 0:28:17 Let’s see. 0:28:19 All right, so RumblePak News says, 0:28:21 how will AI change content creation? 0:28:25 Like being a YouTuber, will AI replace YouTubers? 0:28:30 I personally think that as AI gets more and more prolific 0:28:34 and more and more people use AI and turn to AI 0:28:36 to get responses to their questions 0:28:37 and to learn about things, 0:28:41 I also think simultaneously being a real human 0:28:44 that people actually like relate to 0:28:47 and know is a real human with real human thoughts 0:28:50 is also going to become more important at the same time. 0:28:52 Right, because I think the lines are gonna get blurred 0:28:55 really, really quickly between what content 0:28:58 was actually generated by AI and what content wasn’t. 0:29:00 I mean, when it comes to like written content, 0:29:01 those lines are already blurred. 0:29:04 It’s already like nearly impossible to tell 0:29:06 whether an article was completely written by AI 0:29:08 or completely written by a human 0:29:10 or some sort of hybrid of the two, right? 
0:29:13 And I think you’re gonna see that happen more and more 0:29:16 with video and audio and things like that as well. 0:29:19 And as those lines blur, I think people like us 0:29:21 who are actually putting our face out there, 0:29:24 our voice out there, our opinions, our predictions, 0:29:25 those kinds of things, 0:29:27 I think there’s going to be value in that 0:29:30 in a world where people are having a hard time 0:29:33 telling the difference between reality and not reality. 0:29:35 You know, we’ve already seen this a little bit 0:29:36 like with like VTubers, you know, 0:29:39 where they have these like virtual avatars 0:29:41 that then they talk and they’re like cute 0:29:43 and people watch them on Twitch and things like that. 0:29:45 And that’s kind of like a niche where I feel like 0:29:46 probably AI videos may be similar 0:29:48 where people are gonna think that’s cool 0:29:49 and some people are gonna be into it 0:29:51 and you’ll see some really crazy stuff with it 0:29:54 ’cause you can basically take VTubers to the next level 0:29:56 where like there’s all this interactive stuff going on. 0:29:57 That’ll be fun. 0:29:59 But I think I’ll just be like a niche. 0:30:00 But I think like you said, 0:30:01 more and more people are gonna want 0:30:03 some kind of human connection 0:30:04 and to feel like they have a relationship 0:30:05 with this person that they’re watching 0:30:06 and that they’re learning from 0:30:08 or that they’re enjoying their content. 0:30:10 So I don’t think, if anything, 0:30:13 I think actually maybe YouTubers and having a personality 0:30:14 and anything you do in business 0:30:16 is gonna be more and more important in the future. 0:30:18 I mean, ’cause like, I do believe we’re heading 0:30:21 to the point where you can spin up a company 0:30:23 and maybe you do pay the $10,000 AI model 0:30:24 and you’re saying, 0:30:27 hey, I’m gonna copy so-and-so’s company 0:30:29 and I’m gonna throw these resources at it 0:30:30 and try to beat them up. 0:30:32 You’re gonna see tons of this. 0:30:34 Business is gonna get more and more cutthroat. 0:30:35 And so because of that, 0:30:36 having some kind of personality 0:30:38 that people actually care about, 0:30:39 that’s where you could have some more loyalty 0:30:41 that people are like, oh, I like Matt, 0:30:44 I like Nathan, I like HubSpot, whoever, right? 0:30:46 I think that can be more and more important over time. 0:30:46 So if anything else, 0:30:48 if anything, I would be doubling down 0:30:50 on making sure you have a personality 0:30:51 in the work that you do. 0:30:52 – Yeah, yeah. 0:30:54 And I think the types of channels 0:30:57 that might struggle are more of like the faceless channels 0:30:59 that are creating like informational content 0:31:01 that just have a voiceover and nothing else. 0:31:03 I think that kind of content 0:31:05 is probably gonna become more and more of a struggle, 0:31:09 mostly because I think it’s going to get way over saturated. 0:31:11 As AI, as, you know, we’re gonna get to a point 0:31:13 where you can just say generate me a video 0:31:15 on how quantum computing works. 0:31:17 And it’s gonna spit out a 15 minute video 0:31:19 with a voiceover, with background music, 0:31:21 with sound effects, with B-roll. 0:31:25 And that video is going to explain, you know, 0:31:27 how quantum computing works, 0:31:29 and you’re gonna be able to put that on YouTube. 0:31:31 And if you’re like the first few people to do it, 0:31:33 you’ll probably do pretty well. 
0:31:36 But over time, A, I can generate that myself. 0:31:37 I don’t need to go to YouTube 0:31:40 and find somebody to generate that for me 0:31:41 and then publish it to YouTube. 0:31:45 But B, you’re gonna see YouTube just get so over saturated 0:31:47 with that kind of content. 0:31:49 Content that was just like somebody entered a prompt, 0:31:52 put the output, uploaded to YouTube, right? 0:31:55 There’s going to be a phase where that is happening a lot. 0:31:57 Like, I’m predicting that right now. 0:32:00 Give it like a year and a half. 0:32:02 We’re gonna see so much just trash coming. 0:32:04 It’ll be good value content, 0:32:06 but it’ll be so low effort 0:32:09 that there’s gonna be so much of it. 0:32:12 And that, I think, is what worries me about YouTube. 0:32:15 But I also feel that’s where the potential is 0:32:18 if you wanna be a YouTuber because being that real person 0:32:22 that people can see and relate to becomes more valuable. 0:32:24 There’ll be a window where I don’t think people will realize 0:32:26 that it’s AI generated, right? 0:32:28 – Yeah, like on Facebook, you see it right now. 0:32:29 – On Facebook, yeah. 0:32:31 Facebook’s the exact example I was thinking of. 0:32:33 – Yeah, yeah, on Facebook, you see the older people, 0:32:35 the boomers or whatever, where they’re like, 0:32:36 oh my God, that’s so cute. 0:32:39 Like people are posting like obviously fake photos 0:32:40 of whatever. 0:32:43 I saw it was like a cat snowman. 0:32:44 It was like a snowman. 0:32:46 It looked like a cat. 0:32:47 It was a snowman. 0:32:49 And they were like, oh, you’re amazing. 0:32:51 I can’t believe how talented you are. 0:32:52 I was like, oh my God. 0:32:58 – So CyberGo says, do you feel that there’s a disconnect 0:33:02 between creators in the AI field and the general public? 0:33:05 Are we, as creators, sometimes too disconnected 0:33:07 from the ethical or social issues 0:33:09 that this technology brings? 0:33:11 And the reason I like this question 0:33:14 is ’cause personally, I don’t feel disconnected 0:33:15 from those concerns at all. 0:33:18 Like I sort of live in both bubbles. 0:33:21 Like I live in the AI creator bubble 0:33:23 of everybody making the videos and the images 0:33:26 and using all the large language models 0:33:27 to create cool content. 0:33:31 But I also follow a ton of people that are against AI. 0:33:35 Not the people that are responding to my tweets 0:33:38 with like screw you, AI sucks, right? 0:33:40 But the people that are actually out there 0:33:45 like bringing up really good arguments against AI, right? 0:33:47 There are people out there that are doing it in a way 0:33:49 where they’re not just saying, oh, you like AI? 0:33:51 Well, screw you, I hate you, 0:33:54 and I’m just gonna like cuss at you every time you tweet. 0:33:55 There are people out there that are like, 0:33:58 I see your points, here’s my counter points, right? 0:34:01 And they actually want to healthily debate people 0:34:03 who agree with AI. 0:34:05 And I follow a lot of those types of people as well. 0:34:09 So like in a lot of my content that I put on YouTube, 0:34:13 I like to look at it from both sides of the coin, right? 0:34:15 Like when I’m talking about a new AI video model, 0:34:17 I will always kind of talk about, 0:34:19 this is why I think this is really, really cool, 0:34:23 but also here’s some of the issues I see with this as well, 0:34:24 right? 
0:34:27 Like when it comes to tools like Midjourney 0:34:29 and Stable Diffusion and some of those kinds of things, 0:34:32 I think it’s really, really, really cool technology. 0:34:35 I think it generates some amazing images 0:34:38 and it opens up the floodgates for anybody to be creative 0:34:42 and to bring into this world anything that they can imagine. 0:34:45 And I think that’s super amazing, super empowering. 0:34:49 But at the same time, I also think, yeah, but they scraped, 0:34:51 you know, millions of other people’s images 0:34:52 to train this data set. 0:34:54 And those people didn’t get compensated. 0:34:56 And I can go to some of these tools and say, 0:34:59 generate an image that looks like a Banksy. 0:35:01 I don’t know, I couldn’t think of a better artist 0:35:03 in the moment, but you know, generate an image 0:35:04 that looks like this artist. 0:35:07 And it generates an image that looks just like the work 0:35:08 that that artist would have created. 0:35:13 And I do think that like the ethics of that sort of bother me. 0:35:17 Right? So I’m constantly trying to look at things 0:35:19 from both angles. 0:35:22 I tend to live on the side of technology 0:35:26 is going to progress, whether you people over here 0:35:27 like it or not. 0:35:31 So I’m going to learn about it and I’m going to live with it 0:35:33 and I’m going to use it and I’m going to try to implement it 0:35:38 in my various workflows, but that doesn’t mean 0:35:40 I’m not sensitive to the implications 0:35:42 on the other side of the coin with it. 0:35:47 I just don’t think that there’s a sort of, you know, 0:35:48 rewind button. 0:35:51 I don’t think there’s any going back on this now. 0:35:56 And so like, if I was in that position of, 0:35:58 I don’t like this, so I’m not going to use it. 0:36:01 I feel like I’m putting myself in this like helpless position 0:36:06 of like, I’m going to try to stop this from ever happening, 0:36:10 but the strength that I have is not enough 0:36:11 to stop this moving train. 0:36:13 It’s just going to plow right through me. 0:36:16 So I tend to operate on the other side of the train. 0:36:18 I’m going to ride the train instead of standing on the track 0:36:21 trying to put my hands in front of it and stop it, you know? 0:36:23 – I mean, I think we feel pretty similarly about it. 0:36:28 I mean, I kind of consider myself like a, you know, 0:36:30 effective accelerationist, you know, 0:36:32 definitely a techno optimist, 0:36:34 but sometimes those people get a little too crazy, 0:36:35 like accelerate everything, 0:36:38 don’t care about any of the consequences. 0:36:41 I definitely, I don’t really feel like I’m in that camp. 0:36:43 But I do, I mean, I personally feel that AI 0:36:46 is going to make the world way better. 0:36:47 But the same, and I think a lot of people 0:36:49 have actually not properly thought that through 0:36:53 like what AI will enable for us in the future 0:36:56 in terms of robotics, in terms of scientific breakthroughs, 0:36:58 all kinds of different parts of life 0:37:02 that I think AI is going to make better for us. 0:37:03 But there are going to be some major issues. 0:37:05 Like the main thing I think about 0:37:07 is not really the AI art stuff. 0:37:08 Like I understand that argument too, 0:37:11 but I don’t find as much interest and think about that. 0:37:13 I think more about the job displacement, 0:37:15 I think really is the thing that I think more about.
0:37:17 You know, actually I did a speech, 0:37:18 kind of talking about something similar to this 0:37:21 at Stanford several years back, 0:37:22 talking about, you know, 0:37:25 how I grew up in a small town in Alabama. 0:37:27 And yeah, I love technology 0:37:28 and technology usually makes things better. 0:37:30 But in my small town in Alabama, 0:37:32 like as soon as like all the factories were gone, 0:37:34 went to other places, you know, 0:37:35 life changed dramatically. 0:37:37 And this happens throughout history. 0:37:40 And it’s like the adjustment is hard for people. 0:37:43 And it does have a material impact on people’s lives. 0:37:44 So I do feel a lot of sympathy for that. 0:37:47 And I do think major changes are happening, you know, 0:37:49 in my newsletter, like on Lore.com, 0:37:50 that’s what I try to talk about a lot is like, 0:37:52 how can I help people thrive in the age of AI? 0:37:54 ‘Cause that’s all you can do. 0:37:56 Like be as positive about this as you can 0:37:58 and try to figure out how you’re going to, you know, 0:38:00 ride this wave because there’s no stopping it. 0:38:02 Like there’s no way it doesn’t matter 0:38:03 if you’re on the left or right, whatever. 0:38:06 No politicians, if they’re smart, are going to be against this 0:38:09 because it is what’s going to help America be a leader 0:38:10 in the future economy, you know, 0:38:13 just like how we were the winners in the internet. 0:38:15 And that enabled our entire economy 0:38:17 to continue growing like it has. 0:38:19 And the same with entertainment and weapons 0:38:22 and other things in the past, computers, everything else. 0:38:23 It’s the same thing with AI. 0:38:24 Like we have to win at this. 0:38:26 So there’s no, you know, 0:38:27 no matter what anyone feels about it, 0:38:29 there’s not going to be stopping AI. 0:38:31 It’s going to continue getting better. 0:38:34 And so, yeah, so I do think a lot about job displacement. 0:38:36 And I don’t think we have great answers to that. 0:38:38 Like there’s going to be a lot of new opportunities, 0:38:41 but there are going to be major job losses, like major, so. 0:38:46 – Yeah, I tend to be optimistic about the abilities of humans 0:38:51 and their ingenuity and their ability to figure out, 0:38:53 you know, what value they can bring next. 0:38:56 You know, maybe the value isn’t being the guy 0:38:59 that enters data into a spreadsheet 0:39:00 and files your taxes for you. 0:39:03 Maybe the value isn’t the guy that, you know, 0:39:07 goes through all of this previous case law for you 0:39:10 and, you know, helps you fight an argument in court. 0:39:12 Maybe some of that kind of stuff goes away, 0:39:15 but I do think humans will find new ways 0:39:17 to add value to the world. 0:39:22 I mean, the people who sold ice blocks hated it 0:39:24 when the refrigerator came out, right? 0:39:29 The people who were painters hated it when cameras came out. 0:39:33 The people who were hardcore enthusiasts or photographers 0:39:34 hated it when Photoshop came out. 0:39:36 Anybody can manipulate photos. 0:39:39 Like this story is a tale as old as time, 0:39:42 but humans have always continued to figure out how to thrive 0:39:45 and figure out how to like move to the next thing. 0:39:49 And they’ve always figured out how to create new value 0:39:51 in the world when the old way they created value 0:39:54 ceased to be a way to create value. 0:39:56 And I think it’s going to continue that way.
0:39:59 And I choose to be optimistic about that 0:40:03 because I don’t see the value in being pessimistic 0:40:05 and feeling like, oh no, the sky is falling. 0:40:06 We’re all doomed. 0:40:09 Like you can choose which side you want to think, 0:40:11 which side you want to put your brain energy towards. 0:40:13 And I’m going to put it towards the optimistic side 0:40:15 because the other side just feels like hell to me. 0:40:17 – Right, right. 0:40:20 And I do feel like the AI may actually help here too, right? 0:40:22 People are not realizing it. 0:40:24 But like, imagine like, yeah, in the future, 0:40:25 and someone’s going to be like, oh my God, 0:40:26 yeah, AI is going to help. 0:40:28 It took my job and then it’s going to help me. 0:40:30 But there probably will be scenarios where like people 0:40:33 lose their jobs and have to rethink their lives. 0:40:35 And then they’re like, and then AI is way better than them. 0:40:37 They’re like literally chatting with AI. 0:40:40 And the AI like, okay, you’re good at this kind of stuff. 0:40:40 You’re not good at this. 0:40:42 It kind of like actually learns about 0:40:43 what you’re actually good at 0:40:44 ’cause everyone has different kinds 0:40:45 of things they’re good at, right? 0:40:47 And it actually learns what you’re good at 0:40:49 and it helps you put together a plan 0:40:50 of what you should do next. 0:40:51 I think that kind of stuff is going to happen 0:40:54 where people start entirely new careers, 0:40:56 new side businesses, whatever, 0:40:58 just because the AI helped coach them to do it. 0:41:00 And then probably had AI agents 0:41:03 to help them even execute on some of the work, right? 0:41:04 So I think you’re going to see a lot of that 0:41:07 where people before who could not do a business, 0:41:09 maybe because all the legal stuff was annoying 0:41:11 or accounting or whatever, 0:41:13 stuff that they were not interested in, 0:41:15 now AI is going to help them do that. 0:41:16 So I think you’re going to see 0:41:18 a lot of new opportunities for people as well. 0:41:21 If you lose your job, here’s the game plan. 0:41:22 – Just talk to ChatGPT. 0:41:23 – No, here’s the game plan. 0:41:27 If you lose your job, take your entire life savings, 0:41:28 take out a second on your house 0:41:30 or use credit cards or whatever. 0:41:35 Go buy as many cyber robo taxis as you can from Tesla 0:41:37 and then just rent them all out 0:41:40 and then you get income from these taxis 0:41:41 just driving people around. 0:41:42 There you go. 0:41:43 – Yeah. 0:41:45 – The value that people will add to the world 0:41:47 may just be the ownership. 0:41:50 – I am joking, but that is one of the ways. 0:41:51 – I thought you were going to get Nvidia stock. 0:41:52 I thought you were going to get Nvidia stock. 0:41:54 – Yeah, just invest in Nvidia right now. 0:41:57 – Sell everything you’ve got. 0:41:58 – Joking. 0:42:00 – Quite honestly, joking aside, 0:42:03 I do think people like Elon, love him or hate him, 0:42:05 I know a lot of people hate him. 0:42:06 Some people are probably going to say 0:42:08 I’m never going to tune into this podcast again 0:42:09 because Matt mentioned Elon. 0:42:11 I’ve gotten those kind of comments on my YouTube videos, 0:42:13 but love him or hate him, 0:42:16 he is trying to create additional revenue streams 0:42:18 that don’t require labor.
0:42:20 Like if you look at what they’re doing with the RoboCab, 0:42:21 he’s basically saying, 0:42:24 and I think his numbers and his timelines are way off 0:42:26 ’cause they almost always are with Elon, 0:42:27 but he’s basically saying, 0:42:31 you can go buy one of these RoboCabs for $30,000. 0:42:33 It will take you around wherever you need it 0:42:36 to take you around and when you’re not using it, 0:42:38 it will autonomously go and taxi other people around 0:42:41 and you get some of the revenue from that, right? 0:42:44 So like, I think maybe in the future, 0:42:47 like that’s the kind of revenue streams 0:42:48 people are going to be generating. 0:42:50 Now, probably not the best example 0:42:51 because you’ve already got, 0:42:52 you’ve got to be able to afford one of those 0:42:55 to sort of get your feet in the door on something like that. 0:42:58 But I do think like new revenue generation models 0:43:01 are going to pop up, like ignore the RoboCab thing. 0:43:03 I think new revenue generation models 0:43:05 that people haven’t even thought of yet 0:43:10 are going to pop up and replace some of the more laborious 0:43:13 tasks, right? Whether it be, you know, 0:43:17 the blue-collar mining will be done by robots 0:43:21 or the, you know, Excel spreadsheet accountant type stuff 0:43:24 is going to be done by, you know, AIs, right? 0:43:26 I think a lot of that stuff is going to get replaced, 0:43:28 but new stuff will bubble up 0:43:31 that is going to allow you to provide value to the world. 0:43:32 – Yeah, and on the other side too, 0:43:33 a lot of the stuff he’s building 0:43:37 will result in abundance and the entire idea of like, 0:43:38 our economy is driven like based on like, 0:43:40 abundance and scarcity, like what’s, you know, 0:43:45 supply and demand, like if it’s cheaper to produce things, 0:43:46 the cost will go down. 0:43:49 And so over time, the cost of living will actually go down 0:43:50 because of these inventions. 0:43:52 And people are not realizing that, like, 0:43:54 when you have robots out there building all this stuff for us 0:43:56 and it’s cheap to make them, 0:43:57 the cost of everything is going to go down. 0:43:59 – Awesome. Well, I do think that’s probably a good place 0:44:01 to wrap this one up. 0:44:03 I think, you know, we went off on some tangents 0:44:06 and some rants and went sort of deep and theoretical 0:44:07 with a lot of these questions. 0:44:09 So we actually only got through maybe like, 0:44:11 25% of the questions that were asked, 0:44:12 but we are going to save this thread. 0:44:14 We are going to do more of these like, 0:44:17 ask us anything kind of episodes in the future. 0:44:18 They’re a lot of fun. 0:44:22 We love just sort of riffing on whatever anybody wants us 0:44:23 to talk about. 0:44:25 We like, for me, I know that’s like one of my sweet spots. 0:44:28 I love not knowing where the conversation’s going to go. 0:44:29 So that’s a lot of fun. 0:44:31 We are going to save any of the questions that we missed 0:44:33 and circle back around to some of our favorites 0:44:35 in a future episode. 0:44:36 So thank you so much to everybody 0:44:39 who did ask your questions over on X, 0:44:41 over on LinkedIn, over on Threads. 0:44:42 We will be doing more of these. 0:44:44 We really, really appreciate you. 0:44:49 But that being said, I think this is a wrap on this episode. 0:44:51 So thank you so much for tuning into this one. 0:44:55 If you enjoyed this episode, make sure you subscribe to us.
0:44:59 Either subscribe over on YouTube if you want all the visuals 0:45:01 and you want to look at me and Nathan’s beautiful faces 0:45:03 as we talk about this stuff. 0:45:06 If you really, really don’t like looking at our faces, 0:45:09 go subscribe wherever you subscribe to podcasts. 0:45:11 We’re on Spotify, Apple, all the places. 0:45:14 Come tune in and subscribe wherever you listen to podcasts. 0:45:16 And thanks again for tuning in. 0:45:17 Thank you. 0:45:20 (upbeat music) 0:45:22 (upbeat music) 0:45:25 (upbeat music) 0:45:28 (upbeat music) 0:45:30 (upbeat music) 0:45:33 (dramatic music)
Episode 30: How are AI tools really transforming our productivity? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) dive deep into the world of AI-driven workflows in their latest Q&A Special episode. No guest joins this episode, ensuring that our beloved hosts can thoroughly dissect the impact of these smart tools on their workflow.
In this episode, Matt and Nathan discuss several AI tools, addressing market saturation, the affordability and accessibility of advanced models, and intriguing business opportunities. Matt shares how Spotter helps generate video thumbnail concepts, and both hosts discuss the future of content creation, AI’s ethical concerns, and technological advancements on the horizon. Expect insights into their favorite AI tools, the evolving AI market, and the balance between automation and human authenticity.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
(00:00) New o1 model shows potential with agents.
(04:13) Email management agents automate responses and organization.
(09:09) Perplexity, Claude, NotebookLM aid research efficiently.
(11:35) Using various tools for YouTube content optimization.
(14:40) Modify AI thumbnail with Photoshop’s generative fill.
(18:33) AI could widen or narrow wealth disparities globally.
(20:18) AGI is farther out, tool consolidation coming.
(25:46) Quick response preferred over complex inference.
(28:14) AI videos will evolve VTubers but remain niche.
(32:38) Engages both support and critique of AI.
(35:01) AI improves world, poses job-related challenges.
(37:22) Humans adapt and create new value continually.
(40:38) Elon aims for autonomous robo-taxi revenue.
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano