These AI Workflows 10x’d Our Productivity (Q&A Special)

AI transcript
0:00:02 We’ve been getting a ton of questions from you guys
0:00:03 over on social media.
0:00:05 You know, how does this all play out?
0:00:08 What does this all look like in the future?
0:00:11 That sort of concept came up over and over and over again.
0:00:13 I’m hoping in the future it’s just simplified.
0:00:15 Here’s the one model.
0:00:17 We’re going to answer a lot of your questions.
0:00:22 Hey, welcome to the Next Wave Podcast.
0:00:23 I’m Matt Wolfe.
0:00:24 I’m here with Nathan Lands.
0:00:28 On this episode, we’re going to answer a lot of your questions.
0:00:31 And we deep dive into some really fun topics.
0:00:33 We’re going to list out a ton of tools
0:00:35 that we’re using in our own businesses, the tools
0:00:37 that we couldn’t live without.
0:00:39 We’re going to talk about the future of work
0:00:42 and what happens if AI takes all of our jobs?
0:00:44 Where do we go from there?
0:00:46 We’re going to talk about large language models
0:00:50 and what we see as the future of large language models
0:00:51 and so much more.
0:00:53 Get a notepad ready.
0:00:54 We go into a lot of stuff.
0:00:57 This is a fun episode, so let’s just go ahead and get right
0:01:00 into it.
0:01:03 When all your marketing team does is put out fires,
0:01:04 they burn out.
0:01:07 But with HubSpot, they can achieve their best results
0:01:08 without the stress.
0:01:11 Tap into HubSpot’s collection of AI tools,
0:01:14 Breeze, to pinpoint leads, capture attention,
0:01:17 and access all your data in one place.
0:01:19 Keep your marketers cool and your campaign results
0:01:20 hotter than ever.
0:01:24 Visit hubspot.com/marketers to learn more.
0:01:29 [MUSIC PLAYING]
0:01:31 So our first question is from nothead.ai.
0:01:35 So he says, where do you think we will see the next best
0:01:37 leap in AI agents?
0:01:39 Yeah, because we’ve heard about agents for so long,
0:01:41 but like nothing’s actually worked yet.
0:01:43 Like there was all the hype, you know, with BabyAGI
0:01:45 and all those that came out and AutoGPT,
0:01:47 what was it, like almost like a year and a half ago.
0:01:48 And then really nothing happened.
0:01:50 So I think a lot of people were really disappointed
0:01:53 that there was all that hype, but the rumor
0:01:55 is that OpenAI has been telling their investors
0:01:59 that the new o1 model, like not the preview that’s out
0:02:01 right now, but the actual o1, that they’re
0:02:03 having some like pretty good results with agents.
0:02:06 And so I think currently that’s what I’m betting on
0:02:09 is like, you know, if they’re telling their investors that,
0:02:11 like usually you don’t lie to investors.
0:02:13 So if they’re actually telling investors that,
0:02:15 that probably means it’s at least working somewhat.
0:02:16 So I think that’s going to be the next step.
0:02:17 It’s probably not going to be amazing.
0:02:19 It’s going to be like a lot of these things
0:02:21 where sometimes they get overhyped,
0:02:23 but at least if they’re useful in some use cases,
0:02:25 then you just kind of, it’ll get better from there.
0:02:27 – Well, I know like the Rabbit R1, right?
0:02:29 Which was just like horribly reviewed
0:02:31 by everybody who got their hands on it.
0:02:35 Well, they just now started to roll out the large action model.
0:02:39 And supposedly it’s pretty decent now.
0:02:42 Like it can actually watch things on your screen.
0:02:44 So you train it once on how to do something.
0:02:46 Like you can train it on how to go buy something for me
0:02:47 on Amazon.
0:02:50 Once you’ve done it once, it sort of learns how to do that.
0:02:52 And then next time you can say,
0:02:57 hey, go buy me a new water bottle on Amazon or whatever.
0:03:01 And it will go and actually go through all the steps.
0:03:02 I actually have a rabbit.
0:03:03 I haven’t tried that yet,
0:03:06 but I hear they’re actually making some good strides there
0:03:09 on the rabbit, but it’s so hard to say.
0:03:11 Like I feel like everybody’s vision
0:03:14 of what an AI agent is going to be
0:03:16 is like slightly different.
0:03:17 Like we’ve already got perplexity,
0:03:19 which is kind of like an AI agent
0:03:22 where it will search out one query based on what it finds,
0:03:23 search out another query,
0:03:26 and possibly even a third query
0:03:28 and then present you with all the information
0:03:29 that it came up with, right?
0:03:30 We’ve already got that.
0:03:33 You can already create like semi AI agents
0:03:36 with things like make.com and Zapier.
0:03:41 And there’s a tool called MindStudio, and agent.ai,
0:03:45 right, is the one from Dharmesh over at HubSpot.
0:03:47 You’ve got all of these tools
0:03:50 that are kind of agentic already, right?
0:03:53 You’re basically having the AI go and use this tool
0:03:56 via an API and then you can connect all of these APIs
0:03:59 and all of these AIs together
0:04:01 and get what you’re looking for.
0:04:03 But they’re still kind of like convoluted
0:04:06 and complex and sort of tough to build.
0:04:10 But I feel like AI agents is sort of like a tough thing
0:04:13 to really predict where it’s going to end up
0:04:15 because everybody kind of needs them
0:04:17 to do different things for them, right?
0:04:19 So I don’t know if there’s going to be
0:04:21 at least in the very, very near term
0:04:23 like a one size fits all AI agent
0:04:25 where people are like, this is it.
0:04:28 We got the AI agent that everybody was looking for.
0:04:29 I think we’re gonna have like a whole bunch
0:04:33 of little sparks of agents all over the place
0:04:35 that all do little different things.
0:04:37 And we’re probably still a little ways off
0:04:39 before it’s like a personal assistant
0:04:42 where you can just tell it to do anything you want, you know?
0:04:43 – Yeah, I mean, I think you’re gonna see lots
0:04:44 of different ones where it’s just
0:04:46 for like a specific use case.
0:04:47 Like you just said, I saw a tweet
0:04:49 from Dan Shipper the other day
0:04:51 where he’s got something they’re gonna be releasing
0:04:52 where it looks like it’s an agent
0:04:53 that like goes through all your emails
0:04:55 and like responds to the ones
0:04:57 that like obviously just need very simple responses,
0:04:58 and you kind of program
0:05:02 what kind of simple responses are okay for it to give.
0:05:03 And then also like, you know,
0:05:04 probably checks the ones that are spam
0:05:06 and the ones that you don’t need to respond to,
0:05:07 and the ones that are important.
0:05:08 And it summarizes all of it for you
0:05:09 and helps take care of them.
0:05:11 So I can see that we’re gonna have all these different agents.
0:05:12 – Does that exist already?
0:05:14 Is that something that Dan’s building?
0:05:15 – Well, he showed a screenshot of it.
0:05:17 Like basically like that it’s working internally
0:05:19 and he like kind of said, like something’s coming soon.
0:05:21 So, and you know, I can imagine that
0:05:24 with something like agent.ai, what Dharmesh is doing.
0:05:25 That makes a lot of sense.
0:05:26 I hope they have some kind of directory
0:05:28 or something like this where it’s like,
0:05:29 here’s an agent for your email.
0:05:32 Here’s an agent for like sales outreach.
0:05:33 Here’s an agent for whatever.
0:05:36 But it probably is several years away
0:05:38 before we have like one that does all of it.
0:05:40 Like maybe that’s like five years away.
0:05:41 – Yeah, yeah.
0:05:43 I mean, I’ve already started building like little things
0:05:45 like the email one.
0:05:47 Like you can use a tool like make.com
0:05:52 and you can use folders or tags inside of Gmail
0:05:55 and basically have make.com watch for any time
0:05:57 an email goes into a specific folder.
0:06:00 And if you put an email in a specific folder,
0:06:03 then make.com follows through on a set of actions for you,
0:06:04 right?
0:06:06 So let’s say like, for example, you get an email
0:06:09 that you’re like, okay, this one is something
0:06:10 I don’t personally need to write a custom reply to;
0:06:13 it’s something that I respond to often.
0:06:14 So I have sort of like a stock response
0:06:17 that I send every time. I can throw it into a folder
0:06:20 that’s like, use one of my stock responses, right?
0:06:23 And then make.com could look in that folder
0:06:26 whenever a new email comes in, read the email
0:06:29 and then sort of decide how to reply based on like a handful
0:06:31 of potential set responses.
0:06:34 And it could save like a whole bunch of time with email, right?
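For anyone who wants to see roughly what that folder-watching workflow looks like outside of make.com, here’s a minimal sketch in plain Python using IMAP/SMTP. The folder name, credentials, and stock responses are placeholders, and in the make.com version described above the “pick a reply” step would typically be handed to an AI rather than simple keyword matching.

```python
import imaplib
import smtplib
from email import message_from_bytes, policy
from email.message import EmailMessage

IMAP_HOST, SMTP_HOST = "imap.gmail.com", "smtp.gmail.com"
USER, APP_PASSWORD = "me@example.com", "app-password"  # placeholder credentials

# Hypothetical stock responses: keyword -> canned reply.
STOCK_RESPONSES = {
    "sponsorship": "Thanks for reaching out! Send over the details and I'll take a look.",
    "podcast": "Appreciate the invite! Here's how I usually handle guest requests...",
}
DEFAULT_REPLY = "Thanks for the note - I'll get back to you soon."

def pick_response(subject: str, body: str) -> str:
    """Choose a canned reply; a make.com scenario would usually route this step through an AI."""
    text = f"{subject} {body}".lower()
    for keyword, reply in STOCK_RESPONSES.items():
        if keyword in text:
            return reply
    return DEFAULT_REPLY

def process_folder(folder: str = "StockResponses") -> None:
    """Check a Gmail label/folder and answer any unread message with a stock reply."""
    imap = imaplib.IMAP4_SSL(IMAP_HOST)
    imap.login(USER, APP_PASSWORD)
    imap.select(folder)
    _, data = imap.search(None, "UNSEEN")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        msg = message_from_bytes(msg_data[0][1], policy=policy.default)
        body_part = msg.get_body(preferencelist=("plain",))
        body = body_part.get_content() if body_part else ""
        reply = EmailMessage()
        reply["From"], reply["To"] = USER, msg["From"]
        reply["Subject"] = "Re: " + (msg["Subject"] or "")
        reply.set_content(pick_response(msg["Subject"] or "", body))
        with smtplib.SMTP_SSL(SMTP_HOST) as smtp:
            smtp.login(USER, APP_PASSWORD)
            smtp.send_message(reply)
    imap.logout()
```

Run process_folder() on a schedule (or trigger the same logic from make.com watching the label) and dragging a message into that folder becomes the whole interface.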
0:06:37 So there’s already like stuff like that that you can do
0:06:41 but I just feel like the definition of an AI agent
0:06:44 or what the expectation of an AI agent is,
0:06:46 it’s sort of a moving goalpost, right?
0:06:50 It’s like AGI, like nobody sort of agrees on what AGI is.
0:06:53 Well, an agent, like the agent that I want
0:06:55 to help me in my business is probably a little bit different
0:06:57 than the agent you want to help you in your business.
0:06:59 It’s probably a little bit different than the agent
0:07:03 that an airline pilot wants for their career, right?
0:07:06 Like everybody wants slightly different things.
0:07:08 And I think it’s the nuance
0:07:10 that makes it sort of complicated right now.
0:07:13 – Yeah, I think OpenAI said that level three,
0:07:15 they’re trying to have like different levels of AGI
0:07:18 ’cause yeah, like defining AGI is actually quite hard.
0:07:20 Like everyone has different ideas of what it means.
0:07:22 I think they put level three as agents
0:07:24 and level two is reasoning and that makes sense
0:07:26 ’cause like that’s the reason the agents didn’t work
0:07:28 is ’cause agents require some level of reasoning.
0:07:30 You know, you give them some kind of task,
0:07:32 they go off and get confused, like BabyAGI
0:07:34 and the other ones got confused.
0:07:35 And so they have to have some kind of reasoning
0:07:36 to actually get past that confusion
0:07:38 and figure out what to do next.
0:07:41 So if they have nailed reasoning with o1, if they really have,
0:07:42 then I think we will start to see
0:07:44 some really cool useful agents.
0:07:46 Ones that actually go off and do stuff for you.
0:07:46 – For sure.
0:07:50 So the next one is from the Jacob Gooden.
0:07:51 He’s actually a buddy of mine.
0:07:53 He used to be the producer of my old show,
0:07:54 Hustle & Flowchart.
0:07:55 Still is the producer of Hustle & Flowchart.
0:07:57 I’m not on that show anymore.
0:07:58 But he asked a good question here.
0:08:00 What is your go-to AI tool?
0:08:03 The thing you use every day and would miss it
0:08:04 if it went away tomorrow?
0:08:06 – Yeah, so mine has become,
0:08:07 I know yours is probably perplexity.
0:08:09 Is that, is that?
0:08:10 – Mine would be two.
0:08:11 Like I think there’s two, right?
0:08:12 Perplexity and Claude.
0:08:16 Those are the two that I use like every single day.
0:08:20 – Yeah, so mine has become the ChatsPT voice
0:08:21 or whatever they’re calling.
0:08:22 I’m just gonna call it ChatsPT voice.
0:08:24 I know they’ve got different names for all their stuff.
0:08:27 But I’m still finding a ton of value of that.
0:08:29 Like talking to it every single day.
0:08:30 I use it for personal stuff.
0:08:32 Like what am I doing today?
0:08:33 What’s my schedule?
0:08:36 Or just things I’m thinking through to take notes.
0:08:37 I use it like that in the morning.
0:08:41 I’ve been using it to help translate with my wife.
0:08:43 And also my son has a ton of fun using it.
0:08:47 It’s been like a really cool way to explain AI to him
0:08:49 and show him what’s possible.
0:08:50 And also perplexity.
0:08:52 I’ve been using perplexity more and more.
0:08:53 Like as I said,
0:08:56 I set it to my default search engine in my browser.
0:08:58 And so when I type in something in the browser now,
0:09:00 it just pops up in perplexity.
0:09:02 And 90% of the time I’m finding that that’s better
0:09:05 than the Google results, honestly.
0:09:08 And then also they just keep releasing,
0:09:09 they’re releasing new things so fast.
0:09:12 They’re like, people will suggest things on Twitter.
0:09:14 And then Aravind, the CEO will see it and respond to it
0:09:16 and say, cool idea or whatever.
0:09:17 And then like a week later,
0:09:19 you’ll see him like post a screenshot
0:09:22 or a link to the thing, right?
0:09:24 – Yeah, I mean, going back to the original question,
0:09:27 I think, yeah, for me, it’s Perplexity and Claude.
0:09:29 I mean, perplexity for all the reasons you just mentioned.
0:09:31 I love it for research, right?
0:09:32 I love going in there and saying,
0:09:35 hey, I’m trying to deep dive on this topic.
0:09:36 What can you find for me?
0:09:37 And then it’s really good
0:09:39 at suggesting follow-up questions too, right?
0:09:42 Like it even kind of takes the thinking out of,
0:09:44 well, what should I ask next to learn more about this?
0:09:46 ‘Cause it gives you like five potential questions
0:09:47 to ask next.
0:09:49 So I really like just sort of going down
0:09:51 a perplexity rabbit hole.
0:09:54 And I like, I use perplexity all the time,
0:09:56 like constantly to sort of research topics.
0:09:59 And then Claude really, really helps me dial in my videos,
0:10:02 right? Especially the short form videos.
0:10:05 Notebook LM has been really, really cool.
0:10:06 I love plugging in stuff into that
0:10:10 and like listening to a podcast back about a topic.
0:10:13 But yeah, if I had to, if there was one tool
0:10:15 that was like, if this was gone tomorrow,
0:10:16 I’d be really, really bummed out.
0:10:19 It’d probably be perplexity would be first place.
0:10:21 Claude would be second place.
0:10:24 – Yeah, I finally tried Notebook LM
0:10:25 with my son over the weekend too.
0:10:28 So we tried Replit and then we tried NotebookLM.
0:10:30 Same thing where we were using ChatGPT voice,
0:10:33 like chatting with it, getting ideas.
0:10:36 And then just typing it right into Notebook LM.
0:10:37 – There was a video going around,
0:10:38 I don’t know if it was on Twitter or something,
0:10:40 but there was a video going around
0:10:41 where somebody made a text file
0:10:45 and they just put the word poop into it like 2,000 times.
0:10:48 And then they uploaded it into Notebook LM.
0:10:50 And like, they made a whole 10 minute podcast
0:10:53 about the document that just had the word poop
0:10:56 posted into it like 10,000 times, right?
0:10:57 And they’re just like, you know,
0:10:59 we’ve talked about a lot of things on this podcast,
0:11:01 a lot of really interesting things,
0:11:03 a lot of non-interesting things.
0:11:05 And today I think we’ve got the most interesting document
0:11:06 we’ve ever seen.
0:11:08 This document is just the word poop
0:11:10 over and over and over again.
0:11:11 – You know, it sounds ridiculous,
0:11:13 but like me and my son had so much fun with it,
0:11:14 like so much fun.
0:11:16 It’s like, God, that’s like, you know,
0:11:18 we’ve talked about like generative entertainment
0:11:20 and stuff in the past and past episodes.
0:11:23 It’s like, but that’s like one of the first examples
0:11:25 of like generative entertainment, right?
0:11:27 Or it’s like, yeah, in the future,
0:11:28 you’re gonna have more and more stuff like this
0:11:31 where you just like generate the stuff that you enjoy,
0:11:32 you know, ’cause everyone’s different.
0:11:34 And like there’s stuff that I find funny,
0:11:36 those people are like, God, that’s stupid.
0:11:40 – Yeah, all right, so moving over to Twitter slash X now,
0:11:43 Railia says, have you found any tools
0:11:44 that are great for coming up
0:11:47 with YouTube video ideas slash titles?
0:11:49 I know a couple of tools, vidIQ,
0:11:51 Spotter Studio, Creator ML,
0:11:53 would be curious to know what else you have played with
0:11:54 and what is working well.
0:11:57 So I feel like this question’s probably directed at me,
0:12:00 but I mostly still use Claude for a lot of this stuff.
0:12:01 I do have a vidIQ account.
0:12:02 I do have a Spotter account.
0:12:04 I do have a TubeBuddy account.
0:12:06 I have like all of those tools,
0:12:09 but most of those are like,
0:12:11 I use Spotter to help me come up
0:12:14 with thumbnail ideas mostly, right?
0:12:19 I use TubeBuddy and vidIQ more for like data analysis
0:12:22 and to watch the stats and to see it
0:12:24 and to like test thumbnails and things like that.
0:12:27 I still use Claude to help me come up with thumbnails
0:12:30 or with titles and like hooks and stuff like that, right?
0:12:31 Like I’ll go to Claude and say,
0:12:34 hey, I want to make a video about X, Y and Z,
0:12:37 help me come up with a good title for this video.
0:12:39 Or what I do a lot of times now
0:12:40 is I’ll record the whole video,
0:12:42 get the transcript from the video, throw it into Claude
0:12:45 and say, here’s a transcript from a video I recently made,
0:12:47 help me come up with a title for it, right?
0:12:52 So I still kind of use the bare bones tools
0:12:54 that aren’t actually designed for YouTube
0:12:57 because a lot of these tools that were designed for YouTube
0:13:01 are just using things like OpenAI or Claude or Llama
0:13:03 or one of these models underneath anyway.
0:13:06 So I’m just sort of like skipping the middle man, honestly.
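If you’d rather script that transcript-to-titles step than paste into the Claude app, here’s a minimal sketch using Anthropic’s Python SDK. The prompt wording, model id, and function name are illustrative assumptions, not anything specified in the episode.

```python
import anthropic

def suggest_titles(transcript: str, n: int = 5) -> str:
    """Ask Claude for title and hook ideas based on a finished video's transcript."""
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    prompt = (
        "Here's a transcript from a video I recently made. "
        f"Suggest {n} YouTube title options, each with a one-line hook.\n\n{transcript}"
    )
    message = client.messages.create(
        model="claude-3-5-sonnet-20240620",  # assumed model id; swap in whatever is current
        max_tokens=600,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text

# Example: print(suggest_titles(open("video_transcript.txt").read()))
```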
0:13:10 – We’ll be right back.
0:13:12 But first, I want to tell you about another great podcast
0:13:13 you’re going to want to listen to.
0:13:15 It’s called Science of Scaling,
0:13:17 hosted by Mark Roberge.
0:13:19 And it’s brought to you by the HubSpot Podcast Network,
0:13:23 the audio destination for business professionals.
0:13:25 Each week, host Mark Roberge,
0:13:27 founding chief revenue officer at HubSpot,
0:13:29 senior lecturer at Harvard Business School
0:13:32 and co-founder of Stage 2 Capital,
0:13:35 sits down with the most successful sales leaders in tech
0:13:37 to learn the secrets, strategies and tactics
0:13:40 to scaling your company’s growth.
0:13:42 He recently did a great episode called,
0:13:45 how do you solve for siloed marketing and sales?
0:13:47 And I personally learned a lot from it.
0:13:49 You’re going to want to check out the podcast,
0:13:50 listen to Science of Scaling,
0:13:53 wherever you get your podcasts.
0:13:57 But like, vidIQ,
0:13:59 what do you use it mainly for, like stats on videos
0:14:00 and things like that?
0:14:01 – So vidIQ, I mostly use it
0:14:03 ’cause they’ve got like a really nice like sidebar
0:14:05 that automatically shows up on YouTube
0:14:07 and it’ll show you if like thumbnails have been recently
0:14:09 changed or titles have been recently changed.
0:14:13 So it’s more to like analyze other videos on other channels
0:14:16 than it is for like my own channel, honestly.
0:14:17 I can go look at other videos
0:14:19 and see how well they’re performing
0:14:21 compared to like their normal videos.
0:14:22 It also puts like a little,
0:14:24 like a multiplier below the video.
0:14:27 So it’ll say this video is performing at like 1.5x,
0:14:29 the normal video on this channel.
0:14:31 This video is performing at 50x the normal video.
0:14:34 This video is performing at 0.5x,
0:14:36 like it’s underperforming for this channel.
0:14:39 So I use it a lot like that to see what types of videos
0:14:40 and titles and thumbnail combinations
0:14:42 are working for other people.
0:14:45 Just to kind of get ideas and stay looped in.
0:14:47 I don’t really copy other people’s ideas,
0:14:50 but like it’s more keeping my finger on the pulse.
0:14:54 Saying that TubeBuddy is an AI first company.
0:14:55 Like they’re actually,
0:14:57 they actually got purchased by a company called Ben Labs
0:15:01 and Ben Labs was a giant like AI research company.
0:15:05 So like TubeBuddy is sort of like built around AI these days.
0:15:09 But yeah, I don’t really know the history
0:15:11 too much of vidIQ or anything like that.
0:15:13 And Spotter actually is really, really good
0:15:15 at generating thumbnail concepts.
0:15:18 You can give it an idea for a video and it’ll use,
0:15:20 I don’t know what AI models are using under the hood,
0:15:22 but it will generate like thumbnails
0:15:26 based on what your channel thumbnails normally look like.
0:15:29 So I will go and I will give it an idea for a video.
0:15:31 It’ll make a thumbnail that looks similar
0:15:33 to like my most popular thumbnails,
0:15:35 but with like new elements in it.
0:15:37 So it actually is sort of trained in
0:15:39 on my existing thumbnails on my channel.
0:15:43 Like it even tries to make like a sort of face
0:15:45 that looks like mine, not really good,
0:15:48 but it’ll make a like a bearded man
0:15:50 that looks somewhat close to me in the thumbnail
0:15:53 just to give you the concept, you know?
0:15:54 – Right.
0:15:54 That’s cool.
0:15:56 Are you actually using that, the Spotter?
0:15:58 Or is that just giving you ideas?
0:15:58 That’s so cool.
0:16:00 – Yeah, well, I don’t actually take
0:16:02 the thumbnail straight from Spotter.
0:16:04 In fact, the thumbnail that it generates for you,
0:16:07 it actually has text on it that says like generated with AI.
0:16:10 Right? So like, I can’t just take the thumbnail straight
0:16:12 from Spotter and upload it to YouTube,
0:16:14 but it’ll give you a concept.
0:16:18 And I can take that concept and use it as like an image
0:16:22 to image inside of stable diffusion
0:16:24 and have stable diffusion generate something
0:16:25 that looks similar.
0:16:29 Or what I’ve done, if I really, really like the thumbnail
0:16:32 it generated for me is I take the thumbnail it generated,
0:16:35 pull it into Photoshop, put a little square around the area
0:16:37 where it says generated with AI.
0:16:40 And then use generative fill to just remove it.
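As a rough illustration of that image-to-image step (not Matt’s exact setup), here’s what it can look like with the diffusers library: feed the concept thumbnail in as the starting image and let Stable Diffusion generate a variation. The model id, prompt, and strength value are assumptions.

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Load a Stable Diffusion checkpoint (any img2img-capable model id works here).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Start from the AI-generated concept thumbnail and drift away from it.
concept = Image.open("spotter_concept.png").convert("RGB").resize((768, 432))
result = pipe(
    prompt="YouTube thumbnail, excited bearded man pointing at a glowing AI robot, bold colors",
    image=concept,
    strength=0.6,        # 0 = copy the input image, 1 = ignore it entirely
    guidance_scale=7.5,
).images[0]
result.save("thumbnail_draft.png")
```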
0:16:42 – So we just discovered a business idea
0:16:43 for anyone listening.
0:16:45 (laughing)
0:16:47 You literally could just go copy Spotter
0:16:51 and not say made with AI and have a business right there.
0:16:51 So.
0:16:53 – Yeah, maybe.
0:16:56 They have some like proprietary stuff behind the scenes
0:16:59 to like actually learn on your channel
0:16:59 and stuff like that.
0:17:02 – Yeah, not that simple, but if you, yeah,
0:17:03 someone’s smart enough.
0:17:04 – Yeah, yeah, yeah, for sure.
0:17:09 Let’s see, coal mine canary says you should address
0:17:12 the topic of there being too many AI tools
0:17:14 and people’s inability to afford them.
0:17:16 It’s going to be a problem.
0:17:20 Now that specific kind of concept
0:17:23 came up a lot on this Twitter thread.
0:17:26 So when I asked you to ask these questions,
0:17:28 the thought of there’s way too many tools,
0:17:31 they all want like a monthly fee to use them.
0:17:33 You know, how does this all play out?
0:17:36 Like what does this all look like in the future?
0:17:40 That sort of concept came up over and over and over again.
0:17:41 And I personally think it’s a problem, right?
0:17:44 Like as the guy who runs future tools
0:17:46 where people are like submitting tools to me every day
0:17:49 and I’m seeing like a hundred new tools every day,
0:17:52 I only approve like 1% of the tools these days
0:17:53 that get submitted to me.
0:17:56 Because so many of them do the exact same thing
0:17:58 and so many of them just feel like
0:18:00 cheap low effort money grabs, right?
0:18:05 People will go, oh, I can use the Flux,
0:18:07 you know, Flux Pro API.
0:18:09 Cool, I’m going to go make an AI image generator
0:18:11 and charge 20 bucks a month for it, right?
0:18:13 And all they’re doing is putting a wrapper
0:18:14 around the flux API.
0:18:16 And I just see so much of that
0:18:19 and it’s just low quality and it’s junk, right?
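To be concrete about how thin that kind of wrapper can be, here’s a sketch of the entire “product”: one web endpoint that forwards the user’s prompt to an image model and returns the result. The generate_with_flux helper is a stand-in for whatever hosted Flux API a wrapper like this would actually call; it is not a real client.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PromptRequest(BaseModel):
    prompt: str

def generate_with_flux(prompt: str) -> str:
    """Stand-in for the real image-generation call; would return an image URL."""
    raise NotImplementedError("call your image-generation provider here")

@app.post("/generate")
def generate(req: PromptRequest) -> dict:
    # The whole "$20/month product": pass the prompt straight through to the model.
    return {"image_url": generate_with_flux(req.prompt)}
```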
0:18:21 So here’s what are your thoughts
0:18:24 about the like oversaturation issue?
0:18:25 – Yeah, I guess I have a lot of thoughts.
0:18:26 I mean, it feels like right now
0:18:27 we’re in like this exploratory phase
0:18:29 where it’s good that that’s happening
0:18:31 ’cause people are out there trying all these things
0:18:33 so we get to explore all the possibilities
0:18:36 before we set it, you know, before things stabilize
0:18:37 and it’s like, oh, here are like the five things
0:18:39 that everyone uses, right?
0:18:40 ‘Cause over time, that probably will happen.
0:18:42 Like, you know, people say they hate monopolies
0:18:44 but that is one of the benefits of a monopoly.
0:18:46 You’ll have less, you’ll have more things
0:18:48 baked into one product over time
0:18:50 and then you’ll pay like one fee for that.
0:18:51 And I think we will have that.
0:18:53 Like I think in like three years from now
0:18:55 you’ll probably have like five things that people pay for
0:18:57 and most people will probably pay for one or two things,
0:18:58 would be my guess.
0:18:59 – Or you don’t pay for any of them
0:19:01 and they’re ad supported, right?
0:19:04 – Yeah, but, you know, there’s something else though
0:19:06 that I’ve been thinking about recently
0:19:08 ’cause like there’s been like, you know, whispers
0:19:11 that like the new models from OpenAI
0:19:14 in the future, the ones that are gonna be supposedly
0:19:16 amazing and like people are gonna think it’s AGI.
0:19:20 Why do people assume that that’s gonna be $25 a month
0:19:20 or whatever?
0:19:22 And they may not be.
0:19:25 Like some of these future models may be very expensive.
0:19:27 Like we may start to have a divide there where it’s like,
0:19:29 yeah, there’s like the $25 a model
0:19:31 and you’re getting what you get now
0:19:34 or you pay like a thousand or 10,000 a month
0:19:36 and you get access to AGI.
0:19:39 And ’cause it may cost a lot to run these future models
0:19:40 and they may be absolutely amazing
0:19:42 and very expensive to run.
0:19:45 So that’s kind of, that’s where I’m actually kind of concerned.
0:19:47 – If you pay 10,000 a month for AGI
0:19:50 can I go tell my AGI to go make me a business
0:19:52 that makes more than 10,000 a month?
0:19:54 – Well, yeah, that’s the thing is like, yeah,
0:19:56 this may literally, you know, ’cause we’ve,
0:19:58 I think we’ve talked a little bit in the past about like,
0:20:01 AI could help, you know,
0:20:03 ’cause right now there’s huge wealth disparity
0:20:04 all around the world, right?
0:20:07 And in theory, AI could help with that
0:20:10 by giving everyone more opportunities and things like that.
0:20:12 But also we could, in a particular situation
0:20:14 where only some people can afford the AI
0:20:16 and everybody who can afford it and use it well,
0:20:19 they get way ahead of everyone else, right?
0:20:22 ‘Cause like, yeah, if you can pay $10,000 a month
0:20:24 to use the newest open AI model
0:20:27 and it’s basically like hiring 20 employees or something,
0:20:28 not that that’s gonna happen now,
0:20:30 but like, let’s say in three years or something,
0:20:31 that could really change your life.
0:20:34 Like, and people who can’t afford that,
0:20:35 it’s not gonna be great for them.
0:20:37 So I do wonder about that.
0:20:40 Like I think people are not thinking about that yet about,
0:20:42 yeah, these things cost a lot right now,
0:20:46 but they may cost a lot more in the future.
0:20:48 – I can see it going either direction honestly,
0:20:51 because I do know like they’re going to continually work
0:20:54 to get the cost of compute down as well, right?
0:20:57 I think as it gets more and more intelligent,
0:20:59 it’s going to require more compute power,
0:21:01 but at the same time,
0:21:03 everybody’s trying to lower the cost of compute, right?
0:21:06 Like NVIDIA is trying to make their chips
0:21:09 more affordable to build and more efficient
0:21:10 and all that sort of stuff.
0:21:12 – And Microsoft and others also,
0:21:14 I think they’re starting to invest into nuclear as well.
0:21:17 So like, we’re gonna need more energy as well
0:21:18 and like cleaner energy.
0:21:20 So I think that’s gonna be one of the good things,
0:21:21 is like, yeah, we’re gonna have probably cleaner energy
0:21:23 all around and more energy.
0:21:25 – So that’s a good thing.
0:21:26 Yeah, I’m not trying to make it negative,
0:21:28 but that is kind of like what’s been
0:21:29 the back of my mind recently.
0:21:33 Like, oh yeah, maybe these really will require like,
0:21:35 you may have to pay like a thousand to $10,000 a month
0:21:37 for like the best models.
0:21:39 – Yeah, I don’t know.
0:21:41 I mean, personally, I think AGI is probably
0:21:43 a little bit farther out than like three years,
0:21:46 but I don’t know, maybe I’m being pessimistic.
0:21:51 I think with like the tool like sort of overload concept,
0:21:53 the saturation of just too many tools,
0:21:54 too many monthly payments.
0:21:56 I do think that’s an issue right now,
0:22:00 but I also do think there will be like a consolidation,
0:22:01 like you mentioned.
0:22:03 I think a lot of people are gonna be using the Googles,
0:22:06 the Microsofts, the, you know, maybe OpenAI,
0:22:09 maybe Anthropic, maybe some of these other companies
0:22:11 all just have like a single platform
0:22:13 where you can generate your images,
0:22:16 generate your videos, generate your text,
0:22:18 you know, have this sort of agentic features
0:22:21 and you pay the one service and it could kind of do it all.
0:22:24 I think that’s eventually where it’s going to get,
0:22:27 maybe even in the near term, but right now I do think it’s a problem.
0:22:30 I think, I also think there’s a lot of people out there
0:22:32 that are just absolutely delusional
0:22:35 with like their product ideas.
0:22:37 I think so many people are going out there and going,
0:22:41 hey, look, I just created an AI tool
0:22:43 that can write children’s books for you.
0:22:45 It’s 50 bucks a month, right?
0:22:47 But then like somebody will go use it once
0:22:51 and be like, cool, I made an okay children’s book
0:22:54 that’s nothing special.
0:22:56 Why am I gonna keep paying monthly for that, right?
0:22:59 Like I see all the time on the Future Tools website,
0:23:02 people send to me like AI tattoo generator
0:23:03 for 10 bucks a month.
0:23:05 Who wants to pay 10 bucks a month
0:23:06 for an AI tattoo generator?
0:23:08 At the very least, I’m gonna pay once,
0:23:11 get my tattoo idea generated, go get my tattoo
0:23:12 and then I don’t need you anymore.
0:23:14 Like I think there’s so many people
0:23:15 that are just delusional with the products
0:23:16 they put out there thinking
0:23:19 that there’s an actual business model behind them
0:23:20 but really it’s just like a,
0:23:22 this is a cool feature that I’m trying to make money off
0:23:25 of quick while AI is hot and in the news
0:23:27 but it’s not a good product.
0:23:29 – Yeah, I mean, in the non-AI world right now,
0:23:30 like the trend is to have stuff
0:23:32 where you like pay for software once
0:23:34 or that’s like kind of a movement happening right now.
0:23:36 You pay for it once and then you own it forever.
0:23:37 – Yeah.
0:23:38 – But with AI, you can’t do that.
0:23:38 So that is the problem, right?
0:23:41 Like these things do cost money to run the models.
0:23:43 It’s not cheap.
0:23:45 And so you kind of have to charge like that.
0:23:47 So there is an issue where a lot of these products
0:23:51 where they’re charging, the value is not there yet.
0:23:53 And I think it’s coming sooner than it sounds like you do
0:23:56 but as of right now, a lot of these products, yeah,
0:23:57 you pay like 25 to 50 bucks a month
0:24:00 and most of them are not worth it really.
0:24:02 – Well, I also think there’s probably a future
0:24:04 and I don’t know how far off I am on this future.
0:24:06 Like I don’t know if it’s within three years,
0:24:07 within 10 years, within 20 years,
0:24:09 but I do think there’s a future
0:24:12 where like most people have like a super computer
0:24:14 in their home, right?
0:24:16 So like an on-device AI
0:24:18 that’s doing all of this stuff for them,
0:24:20 but it’s in their house.
0:24:22 So they don’t actually have to send it off to a data center
0:24:24 or cloud GPUs or whatever, right?
0:24:27 Like I think a lot of, I think that will happen as well.
0:24:32 I think there’s like a big future in on-device inference
0:24:35 for running the AI’s locally, right?
0:24:38 Like if we’re gonna have our own Jarvis, right?
0:24:40 That’s cleaning our house and doing our dishes
0:24:43 and doing our laundry and vacuuming our floors
0:24:48 and, you know, does everything for us on a daily basis.
0:24:53 I have a really hard time seeing that all be in the cloud,
0:24:53 right?
0:24:57 What happens if that cloud service goes down for the day?
0:24:59 All right, everything I do with my life,
0:25:02 I can’t do today because that service is down.
0:25:05 What happens if like there’s internet outages?
0:25:09 Okay, now I can’t actually run all of the stuff
0:25:11 that I’ve been running in my life
0:25:14 because I can’t contact Google services, you know,
0:25:15 servers today, right?
0:25:18 So I do think that there’s probably also a future
0:25:20 where it might only be for the wealthy.
0:25:22 I don’t know, but I do think people are gonna have
0:25:24 like super computers in their home
0:25:28 that can like run these massive AI systems at some point.
0:25:29 – Yeah, I was thinking about that.
0:25:31 You’re talking about Jarvis, but like, yeah, Tony Stark
0:25:32 is supposed to be a billionaire, right?
0:25:33 In the stories.
0:25:35 – Yeah, exactly.
0:25:37 – So maybe, you know, it is a billionaires
0:25:38 who have the local models.
0:25:41 ‘Cause I mean, in the future that could happen,
0:25:43 but it feels like for a long time,
0:25:44 you’re gonna need a lot of compute
0:25:45 to make these models really useful.
0:25:47 And so obviously the ones in the cloud
0:25:48 where they’re benefiting from the large scale
0:25:51 of all the servers being one place and remotely,
0:25:52 that’s gonna always be way better
0:25:53 for at least for a while.
0:25:55 – Yeah, well, I mean, I think the training
0:25:56 can happen remotely,
0:25:59 but the inference would happen locally, right?
0:26:02 So, you know, when they actually train these models,
0:26:04 I know you know this, I’m just more saying this
0:26:05 for like the audience, right?
0:26:07 But like when they train these big models,
0:26:10 like the newest version of like Claude,
0:26:13 the newest version of chat, GPT, things like that,
0:26:16 a lot of times it takes like millions of dollars
0:26:18 and months and months and months and months
0:26:20 to train a new model, right?
0:26:23 I don’t think we’re very close to doing that at home,
0:26:24 but the inference part,
0:26:26 the part where we ask it the question,
0:26:30 it sort of queries the database and then responds to you,
0:26:32 that’s a lot less compute intense.
0:26:33 So I think,
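For a sense of what “inference running locally” can look like today, here’s a minimal sketch using llama-cpp-python with a model file that lives entirely on your own machine — tooling that isn’t mentioned in the episode, just one common way to do on-device inference. The model path is a placeholder.

```python
from llama_cpp import Llama

# Load a quantized model file stored locally; nothing leaves the machine.
llm = Llama(model_path="models/local-8b-instruct.Q4_K_M.gguf", n_ctx=4096)

result = llm.create_chat_completion(
    messages=[{
        "role": "user",
        "content": "Summarize my day: gym at 7, two podcast recordings, dinner with family.",
    }]
)
print(result["choices"][0]["message"]["content"])  # no network call involved
```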
0:26:35 – Well, I’m not sure.
0:26:38 So like based on what OpenAI is saying about the o1 model,
0:26:40 it seems like it’s gonna be more and more intensive
0:26:41 on the inference side.
0:26:43 So if that is how they start scaling things up,
0:26:45 and if let’s say the anthropic,
0:26:47 I assume maybe anthropics going the same route,
0:26:49 maybe they’re like a few months behind OpenAI or whatever,
0:26:52 we’re probably gonna see all these guys like go into the,
0:26:53 you know, we’re basically,
0:26:54 inference is gonna be more and more important.
0:26:56 Like, and so I don’t know, I’m not sure.
0:26:59 Like I think inference is gonna be one of the ways
0:26:59 that this all scales,
0:27:02 ’cause like the logic side is what’s been missing.
0:27:04 And so it seems like that’s where they’re all gonna be
0:27:05 focusing on.
0:27:06 – Yeah, yeah.
0:27:09 But I mean, I still think that the inference side,
0:27:14 it’s going to, like the stuff that OpenAI is talking about
0:27:16 is more for like the really sort of complex
0:27:19 mathematical stuff in the current state.
0:27:22 I don’t really think,
0:27:23 I don’t know how to word this,
0:27:25 but I don’t really think that like,
0:27:28 making it take a lot longer on the inference side
0:27:29 is what people are gonna want.
0:27:31 So I don’t, like I think they’re gonna have to figure out
0:27:34 how to like really, really shrink that time down.
0:27:35 So when you give it a task,
0:27:38 you get the response quickly, right?
0:27:40 I think slowing it down is cool
0:27:43 if I need to like analyze a complex math problem
0:27:46 or write some code for me or something like that.
0:27:50 But if I need to just like get a response really quickly
0:27:53 on the best ice cream shop near me,
0:27:55 I don’t want it to like analyze for 15 minutes
0:27:56 before it gives me a response.
0:27:58 – Yeah, yeah, but that’s where you would scale up compute.
0:28:01 So if you scale up compute, you could do that faster.
0:28:03 Like, so I think those systems, like they’ll,
0:28:05 they’ll want that to be fast.
0:28:07 They know that that’s not usable for regular people,
0:28:10 like waiting, you know, 20 seconds or a minute or whatever,
0:28:11 like people are gonna expect
0:28:13 these things are happening almost instantly.
0:28:14 – Yeah, yeah, yeah.
0:28:15 We’re getting very theoretical here.
0:28:17 Let’s see.
0:28:19 All right, so RumblePak News says,
0:28:21 how will AI change content creation?
0:28:25 Like being a YouTuber, will AI replace YouTubers?
0:28:30 I personally think that as AI gets more and more prolific
0:28:34 and more and more people use AI and turn to AI
0:28:36 to get responses to their questions
0:28:37 and to learn about things,
0:28:41 I also think simultaneously being a real human
0:28:44 that people actually like relate to
0:28:47 and know is a real human with real human thoughts
0:28:50 is also going to become more important at the same time.
0:28:52 Right, because I think the lines are gonna get blurred
0:28:55 really, really quickly between what content
0:28:58 was actually generated by AI and what content wasn’t.
0:29:00 I mean, when it comes to like written content,
0:29:01 those lines are already blurred.
0:29:04 It’s already like nearly impossible to tell
0:29:06 whether an article was completely written by AI
0:29:08 or completely written by a human
0:29:10 or some sort of hybrid of the two, right?
0:29:13 And I think you’re gonna see that happen more and more
0:29:16 with video and audio and things like that as well.
0:29:19 And as those lines blur, I think people like us
0:29:21 who are actually putting our face out there,
0:29:24 our voice out there, our opinions, our predictions,
0:29:25 those kinds of things,
0:29:27 I think there’s going to be value in that
0:29:30 in a world where people are having a hard time
0:29:33 telling the difference between reality and not reality.
0:29:35 You know, we’ve already seen this a little bit
0:29:36 like with like VTubers, you know,
0:29:39 where they have these like virtual avatars
0:29:41 that then they talk and they’re like cute
0:29:43 and people watch them on Twitch and things like that.
0:29:45 And that’s kind of like a niche where I feel like
0:29:46 probably AI videos may be similar
0:29:48 where people are gonna think that’s cool
0:29:49 and some people are gonna be into it
0:29:51 and you’ll see some really crazy stuff with it
0:29:54 ’cause you can basically take VTubers to the next level
0:29:56 where like there’s all this interactive stuff going on.
0:29:57 That’ll be fun.
0:29:59 But I think I’ll just be like a niche.
0:30:00 But I think like you said,
0:30:01 more and more people are gonna want
0:30:03 some kind of human connection
0:30:04 and to feel like they have a relationship
0:30:05 with this person that they’re watching
0:30:06 and that they’re learning from
0:30:08 or that they’re enjoying their content.
0:30:10 So I don’t think, if anything,
0:30:13 I think actually maybe YouTubers and having a personality
0:30:14 and anything you do in business
0:30:16 is gonna be more and more important in the future.
0:30:18 I mean, ’cause like, I do believe we’re heading
0:30:21 to the point where you can spin up a company
0:30:23 and maybe you do pay the $10,000 AI model
0:30:24 and you’re saying,
0:30:27 hey, I’m gonna copy so-and-so’s company
0:30:29 and I’m gonna throw these resources at it
0:30:30 and try to beat them up.
0:30:32 You’re gonna see tons of this.
0:30:34 Business is gonna get more and more cutthroat.
0:30:35 And so because of that,
0:30:36 having some kind of personality
0:30:38 that people actually care about,
0:30:39 that’s where you could have some more loyalty
0:30:41 that people are like, oh, I like Matt,
0:30:44 I like Nathan, I like HubSpot, whoever, right?
0:30:46 I think that can be more and more important over time.
0:30:46 So if anything else,
0:30:48 if anything, I would be doubling down
0:30:50 on making sure you have a personality
0:30:51 in the work that you do.
0:30:52 – Yeah, yeah.
0:30:54 And I think the types of channels
0:30:57 that might struggle are more of like the faceless channels
0:30:59 that are creating like informational content
0:31:01 that just have a voiceover and nothing else.
0:31:03 I think that kind of content
0:31:05 is probably gonna become more and more of a struggle,
0:31:09 mostly because I think it’s going to get way over saturated.
0:31:11 As AI, as, you know, we’re gonna get to a point
0:31:13 where you can just say generate me a video
0:31:15 on how quantum computing works.
0:31:17 And it’s gonna spit out a 15 minute video
0:31:19 with a voiceover, with background music,
0:31:21 with sound effects, with B-roll.
0:31:25 And that video is going to explain, you know,
0:31:27 how quantum computing works,
0:31:29 and you’re gonna be able to put that on YouTube.
0:31:31 And if you’re like the first few people to do it,
0:31:33 you’ll probably do pretty well.
0:31:36 But over time, A, I can generate that myself.
0:31:37 I don’t need to go to YouTube
0:31:40 and find somebody to generate that for me
0:31:41 and then publish it to YouTube.
0:31:45 But B, you’re gonna see YouTube just get so over saturated
0:31:47 with that kind of content.
0:31:49 Content that was just like somebody entered a prompt,
0:31:52 put the output, uploaded to YouTube, right?
0:31:55 There’s going to be a phase where that is happening a lot.
0:31:57 Like, I’m predicting that right now.
0:32:00 Give it like a year and a half.
0:32:02 We’re gonna see so much just trash coming.
0:32:04 It’ll be good value content,
0:32:06 but it’ll be so low effort
0:32:09 that there’s gonna be so much of it.
0:32:12 And that, I think, is what worries me about YouTube.
0:32:15 But I also feel that’s where the potential is
0:32:18 if you wanna be a YouTuber because being that real person
0:32:22 that people can see and relate to becomes more valuable.
0:32:24 There’ll be a window where I don’t think people will realize
0:32:26 that it’s AI generated, right?
0:32:28 – Yeah, like on Facebook, you see it right now.
0:32:29 – On Facebook, yeah.
0:32:31 Facebook’s the exact example I was thinking of.
0:32:33 – Yeah, yeah, on Facebook, you see the older people,
0:32:35 the boomers or whatever, where they’re like,
0:32:36 oh my God, that’s so cute.
0:32:39 Like people are posting like obviously fake photos
0:32:40 of whatever.
0:32:43 I saw it was like a cat snowman.
0:32:44 It was like a snowman.
0:32:46 It looked like a cat.
0:32:47 It was a snowman.
0:32:49 And they were like, oh, you’re amazing.
0:32:51 I can’t believe how talented you are.
0:32:52 I was like, oh my God.
0:32:58 – So CyberGo says, do you feel that there’s a disconnect
0:33:02 between creators in the AI field and the general public?
0:33:05 Are we, as creators, sometimes too disconnected
0:33:07 from the ethical or social issues
0:33:09 that this technology brings?
0:33:11 And the reason I like this question
0:33:14 is ’cause personally, I don’t feel disconnected
0:33:15 from those concerns at all.
0:33:18 Like I sort of live in both bubbles.
0:33:21 Like I live in the AI creator bubble
0:33:23 of everybody making the videos and the images
0:33:26 and using all the large language models
0:33:27 to create cool content.
0:33:31 But I also follow a ton of people that are against AI.
0:33:35 Not the people that are responding to my tweets
0:33:38 with like screw you, AI sucks, right?
0:33:40 But the people that are actually out there
0:33:45 like bringing up really good arguments against AI, right?
0:33:47 There are people out there that are doing it in a way
0:33:49 where they’re not just saying, oh, you like AI?
0:33:51 Well, screw you, I hate you,
0:33:54 and I’m just gonna like cuss at you every time you tweet.
0:33:55 There are people out there that are like,
0:33:58 I see your points, here’s my counter points, right?
0:34:01 And they actually want to healthily debate people
0:34:03 who agree with AI.
0:34:05 And I follow a lot of those types of people as well.
0:34:09 So like in a lot of my content that I put on YouTube,
0:34:13 I like to look at it from both sides of the coin, right?
0:34:15 Like when I’m talking about a new AI video model,
0:34:17 I will always kind of talk about,
0:34:19 this is why I think this is really, really cool,
0:34:23 but also here’s some of the issues I see with this as well,
0:34:24 right?
0:34:27 Like when it comes to tools like mid-journey
0:34:29 and stable diffusion and some of those kinds of things,
0:34:32 I think it’s really, really, really cool technology.
0:34:35 I think it generates some amazing images
0:34:38 and it opens up the floodgates for anybody to be creative
0:34:42 and to bring into this world anything that they can imagine.
0:34:45 And I think that’s super amazing, super empowering.
0:34:49 But at the same time, I also think, yeah, but they scraped,
0:34:51 you know, millions of other people’s images
0:34:52 to train this data set.
0:34:54 And those people didn’t get compensated.
0:34:56 And I can go to some of these tools and say,
0:34:59 generate an image that looks like a Banksy.
0:35:01 I don’t know, I couldn’t think of a better artist
0:35:03 in the moment, but you know, generate an image
0:35:04 that looks like this artist.
0:35:07 And it generates an image that looks just like the work
0:35:08 that that artist would have created.
0:35:13 And I do think that like the ethics of that sort of bother me.
0:35:17 Right? So I’m constantly trying to look at things
0:35:19 from both angles.
0:35:22 I tend to live on the side of technology
0:35:26 is going to progress, whether you people over here
0:35:27 like it or not.
0:35:31 So I’m going to learn about it and I’m going to live with it
0:35:33 and I’m going to use it and I’m going to try to implement it
0:35:38 in my various workflows, but that doesn’t mean
0:35:40 I’m not sensitive to the implications
0:35:42 on the other side of the coin with it.
0:35:47 I just don’t think that there’s a sort of, you know,
0:35:48 rewind button.
0:35:51 I don’t think there’s any going back on this now.
0:35:56 And so like, if I was in that position of,
0:35:58 I don’t like this, so I’m not going to use it.
0:36:01 I feel like I’m putting myself in this like helpless position
0:36:06 of like, I’m going to try to stop this from ever happening,
0:36:10 but the strength that I have is not enough
0:36:11 to stop this moving train.
0:36:13 It’s just going to plow right through me.
0:36:16 So I tend to operate on the other side of the train.
0:36:18 I’m going to ride the train instead of standing on the track
0:36:21 trying to put my hands in front of it and stop it, you know?
0:36:23 – I mean, I think we feel pretty similarly about it.
0:36:28 I mean, I kind of consider myself like a, you know,
0:36:30 effective accelerationist, you know,
0:36:32 definitely a techno optimist,
0:36:34 but sometimes those people get a little too crazy,
0:36:35 like accelerate everything,
0:36:38 don’t care about any of the consequences.
0:36:41 I definitely, I don’t really feel like I’m in that camp.
0:36:43 But I do, I mean, I personally feel that AI
0:36:46 is going to make the world way better.
0:36:47 But the same, and I think a lot of people
0:36:49 have actually not properly thought that through
0:36:53 like what AI will enable for us in the future
0:36:56 in terms of robotics, in terms of scientific breakthroughs,
0:36:58 all kinds of different parts of life
0:37:02 that I think AI is going to make better for us.
0:37:03 But there are going to be some major issues.
0:37:05 Like the main thing I think about
0:37:07 is not really the AI art stuff.
0:37:08 Like I understand that argument too,
0:37:11 but I don’t find as much interest and think about that.
0:37:13 I think more about the job displacement,
0:37:15 I think really is the thing that I think more about.
0:37:17 You know, actually I did a speech,
0:37:18 kind of talking about something similar to this
0:37:21 at Stanford several years back,
0:37:22 talking about, you know,
0:37:25 how I grew up in a small town in Alabama.
0:37:27 And yeah, I love technology
0:37:28 and technology usually makes things better.
0:37:30 But in my small town in Alabama,
0:37:32 like as soon as like all the factories were gone,
0:37:34 went to other places, you know,
0:37:35 life changed dramatically.
0:37:37 And this happens throughout history.
0:37:40 And it’s like the adjustment is hard for people.
0:37:43 And it does have a material impact on people’s lives.
0:37:44 So I do feel a lot of sympathy for that.
0:37:47 And I do think major changes are happening, you know,
0:37:49 in my newsletter, like on Lore.com,
0:37:50 that’s what I try to talk about a lot is like,
0:37:52 how can I help people thrive in the age of AI?
0:37:54 ‘Cause that’s all you can do.
0:37:56 Like be as positive about this as you can
0:37:58 and try to figure out how you’re going to, you know,
0:38:00 ride this wave because there’s no stopping it.
0:38:02 Like there’s no way it doesn’t matter
0:38:03 if you’re on the left or right, whatever.
0:38:06 No politicians, if they’re smart, are going to be against this
0:38:09 because it is what’s going to help America be a leader
0:38:10 in the future economy, you know,
0:38:13 just like how we were the winners in the internet.
0:38:15 And that enabled our entire economy
0:38:17 to continue growing like it has.
0:38:19 And the same with entertainment and weapons
0:38:22 and other things in the past, computers, everything else.
0:38:23 It’s the same thing with AI.
0:38:24 Like we have to win at this.
0:38:26 So there’s no, you know,
0:38:27 no matter what anyone feels about it,
0:38:29 there’s not going to be stopping AI.
0:38:31 It’s going to continue getting better.
0:38:34 And so, yeah, so I do think a lot about job displacement.
0:38:36 And I don’t think we have great answers to that.
0:38:38 Like there’s going to be a lot of new opportunities,
0:38:41 but there are going to be major job losses, like major, so.
0:38:46 – Yeah, I tend to be optimistic about the abilities of humans
0:38:51 and their ingenuity and their ability to figure out,
0:38:53 you know, what value they can bring next.
0:38:56 You know, maybe the value isn’t being the guy
0:38:59 that enters data into a spreadsheet
0:39:00 and files your taxes for you.
0:39:03 Maybe the value isn’t the guy that, you know,
0:39:07 goes through all of this previous case law for you
0:39:10 and, you know, helps you fight an argument in court.
0:39:12 Maybe some of that kind of stuff goes away,
0:39:15 but I do think humans will find new ways
0:39:17 to add value to the world.
0:39:22 I mean, the people who sold ice blocks hated it
0:39:24 when the refrigerator came out, right?
0:39:29 The people who were painters hated it when cameras came out.
0:39:33 The people who were hardcore enthusiasts or photographers
0:39:34 hated it when Photoshop came out.
0:39:36 Anybody can manipulate photos.
0:39:39 Like this story is a tale as old as time,
0:39:42 but humans have always continued to figure out how to thrive
0:39:45 and figure out how to like move to the next thing.
0:39:49 And they’ve always figured out how to create new value
0:39:51 in the world when the old way they created value
0:39:54 ceased to be a way to create value.
0:39:56 And I think it’s going to continue that way.
0:39:59 And I choose to be optimistic about that
0:40:03 because I don’t see the value in being pessimistic
0:40:05 and feeling like, oh no, the sky is falling.
0:40:06 We’re all doomed.
0:40:09 Like you can choose which side you want to think,
0:40:11 which side you want to put your brain energy towards.
0:40:13 And I’m going to put it towards the optimistic side
0:40:15 because the other side just feels like hell to me.
0:40:17 – Right, right.
0:40:20 And I do feel like the AI may actually help here too, right?
0:40:22 People are not realizing it.
0:40:24 But like, imagine like, yeah, in the future,
0:40:25 and someone’s going to be like, oh my God,
0:40:26 yeah, AI is going to help.
0:40:28 It took my job and then it’s going to help me.
0:40:30 But there probably will be scenarios where like people
0:40:33 lose their jobs and have to rethink their lives.
0:40:35 And then they’re like, and then AI is way better than them.
0:40:37 They’re like literally chatting with AI.
0:40:40 And the AI like, okay, you’re good at this kind of stuff.
0:40:40 You’re not good at this.
0:40:42 It kind of like actually learns about
0:40:43 what you’re actually good at
0:40:44 ’cause everyone has different kinds
0:40:45 of things they’re good at, right?
0:40:47 And it actually learns what you’re good at
0:40:49 and it helps you put together a plan
0:40:50 of what you should do next.
0:40:51 I think that kind of stuff is going to happen
0:40:54 where people start entirely new careers,
0:40:56 new side businesses, whatever,
0:40:58 just because the AI helped coach them to do it.
0:41:00 And then probably had AI agents
0:41:03 to help them even execute on some of the work, right?
0:41:04 So I think you’re going to see a lot of that
0:41:07 where people before who could not do a business,
0:41:09 maybe because all the legal stuff was annoying
0:41:11 or accounting or whatever,
0:41:13 stuff that they were not interested in,
0:41:15 now AI is going to help them do that.
0:41:16 So I think you’re going to see
0:41:18 a lot of new opportunities for people as well.
0:41:21 If you lose your job, here’s the game plan.
0:41:22 – Just talk to ChatGPT.
0:41:23 – No, here’s the game plan.
0:41:27 If you lose your job, take your entire life savings,
0:41:28 take out a second on your house
0:41:30 or use credit cards or whatever.
0:41:35 Go buy as many Cybercab robotaxis as you can from Tesla
0:41:37 and then just rent them all out
0:41:40 and then you get income from these taxis
0:41:41 just driving people around.
0:41:42 There you go.
0:41:43 – Yeah.
0:41:45 – The value that people will add to the world
0:41:47 may just be the ownership.
0:41:50 – I am joking, but that is one of the ways.
0:41:51 – I thought you were going to get in video stock.
0:41:51 – I thought you were going to say NVIDIA stock.
0:41:52 I thought you were going to say NVIDIA stock.
0:41:54 – Yeah, just invest in NVIDIA right now.
0:41:58 – Joking.
0:42:00 – Quite honestly, joking aside,
0:42:03 I do think people like Elon, love him or hate him,
0:42:05 I know a lot of people hate him.
0:42:06 Some people are probably going to say
0:42:08 I’m never going to tune into this podcast again
0:42:09 because Matt mentioned Elon.
0:42:11 I’ve gotten those kind of comments on my YouTube videos,
0:42:13 but love him or hate him,
0:42:16 he is trying to create additional revenue streams
0:42:18 that don’t require labor.
0:42:20 Like if you look at what they’re doing with the RoboCab,
0:42:21 he’s basically saying,
0:42:24 and I think his numbers and his timelines are way off
0:42:26 ’cause they almost always are with Elon,
0:42:27 but he’s basically saying,
0:42:31 you can go buy one of these RoboCabs for $30,000.
0:42:33 It will take you around wherever you need it
0:42:36 to take you around and when you’re not using it,
0:42:38 it will autonomously go and taxi other people around
0:42:41 and you get some of the revenue from that, right?
0:42:44 So like, I think maybe in the future,
0:42:47 like that’s the kind of revenue streams
0:42:48 people are going to be generating.
0:42:50 Now, probably not the best example
0:42:51 because you’ve already got,
0:42:52 you’ve got to be able to afford one of those
0:42:55 to sort of get your feet in the door on something like that.
0:42:58 But I do think like new revenue generation models
0:43:01 are going to pop up, like ignore the RoboCab thing.
0:43:03 I think new revenue generation models
0:43:05 that people haven’t even thought of yet
0:43:10 are going to pop up and replace some of the more laborious
0:43:13 tasks, right? Whether it be, you know,
0:43:17 the blue-collar mining will be done by robots
0:43:21 or the, you know, Excel spreadsheet accountant type stuff
0:43:24 is going to be done by, you know, AIs, right?
0:43:26 I think a lot of that stuff is going to get replaced,
0:43:28 but new stuff will bubble up
0:43:31 that is going to allow you to provide value to the world.
0:43:32 – Yeah, and on the other side too,
0:43:33 a lot of the stuff he’s building
0:43:37 will result in abundance and the entire idea of like,
0:43:38 our economy is driven like based on like,
0:43:40 abundant scarcity, like what’s, you know,
0:43:45 supply and demand, like if it’s cheaper to produce things,
0:43:46 the cost will go down.
0:43:49 And so over time, the cost of living will actually go down
0:43:50 because of these inventions.
0:43:52 And people are not realizing that, like,
0:43:54 when you have robots out there building all this stuff for us
0:43:56 and it’s cheap to make them,
0:43:57 the cost of everything is going to go down.
0:43:59 – Awesome. Well, I do think that’s probably a good place
0:44:01 to wrap this one up.
0:44:03 I think, you know, we went off on some tangents
0:44:06 and some rants and went sort of deep and theoretical
0:44:07 with a lot of these questions.
0:44:09 So we actually only got through maybe like,
0:44:11 25% of the questions that were asked,
0:44:12 but we are going to save this thread.
0:44:14 We are going to do more of these like,
0:44:17 ask us anything kind of episodes in the future.
0:44:18 They’re a lot of fun.
0:44:22 We love just sort of riffing on whatever anybody wants us
0:44:23 to talk about.
0:44:25 For me, I know that’s like one of my sweet spots.
0:44:28 I love not knowing where the conversation’s going to go.
0:44:29 So that’s a lot of fun.
0:44:31 We are going to save any of the questions that we missed
0:44:33 and circle back around to some of our favorites
0:44:35 in a future episode.
0:44:36 So thank you so much to everybody
0:44:39 who did ask your questions over on X,
0:44:41 over on LinkedIn, over on threads.
0:44:42 We will be doing more of these.
0:44:44 We really, really appreciate you.
0:44:49 But that being said, I think this is a wrap on this episode.
0:44:51 So thank you so much for tuning into this one.
0:44:55 If you enjoyed this episode, make sure you subscribe to us.
0:44:59 Either subscribe over on YouTube if you want all the visuals
0:45:01 and you want to look at me and Nathan’s beautiful faces
0:45:03 as we talk about this stuff.
0:45:06 If you really, really don’t like looking at our faces,
0:45:09 go subscribe wherever you subscribe to podcasts.
0:45:11 We’re on Spotify, Apple, all the places.
0:45:14 Come tune in and subscribe wherever you listen to podcasts.
0:45:16 And thanks again for tuning in.
0:45:17 Thank you.
0:45:20 (upbeat music)
0:45:33 (dramatic music)

Episode 30: How are AI tools really transforming our productivity? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands)  dive deep into the world of AI-driven workflows in their latest Q&A Special episode. No guest joins this episode, ensuring that our beloved hosts can thoroughly dissect the impact of these smart tools on their workflow.

In this episode, Matt and Nathan discuss several AI tools, addressing market saturation, the affordability and accessibility of advanced models, and intriguing business opportunities. Matt shares how Spotter helps generate video thumbnail concepts, and both hosts discuss the future of content creation, AI’s ethical concerns, and technological advancements on the horizon. Expect insights into their favorite AI tools, the evolving AI market, and the balance between automation and human authenticity.

Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd

Show Notes:

  • (00:00) New o1 model shows potential with agents.
  • (04:13) Email management agents automate responses and organization.
  • (09:09) Perplexity, Claude, NotebookLM aid research efficiently.
  • (11:35) Using various tools for YouTube content optimization.
  • (14:40) Modify AI thumbnail with Photoshop’s generative fill.
  • (18:33) AI could widen or narrow wealth disparities globally.
  • (20:18) AGI is farther out, tool consolidation coming.
  • (25:46) Quick response preferred over complex inference.
  • (28:14) AI videos will evolve VTubers but remain niche.
  • (32:38) Engages both support and critique of AI.
  • (35:01) AI improves world, poses job-related challenges.
  • (37:22) Humans adapt and create new value continually.
  • (40:38) Elon aims for autonomous robo-taxi revenue.
  • (43:05) Enjoying spontaneous conversations; future questions appreciated.

Mentions:

Check Out Matt’s Stuff:

• Future Tools – https://futuretools.beehiiv.com/

• Blog – https://www.mattwolfe.com/

• YouTube- https://www.youtube.com/@mreflow

Check Out Nathan’s Stuff:

The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
