AI Agents Are About to Change Everything (Here’s Why)

AI transcript
0:00:04 The day that we’re recording this has been one of the craziest weeks in the world of AI.
0:00:07 It’s crazy what these tools can do.
0:00:11 I think by this time next year, it’s not going to be about like the single AI agent that you’re using.
0:00:15 It’s going to be like, what is your team of AIs that you manage and you’re the CEO of?
0:00:19 What is your team doing?
0:00:21 Hey, welcome to the Next Way podcast.
0:00:21 I’m Matt Wolf.
0:00:23 I’m here with Nathan Lanz.
0:00:26 And today we’ve got a third co-host.
0:00:29 Today we’re chatting with Don Allen Stevenson, the third.
0:00:31 He worked at DreamWorks.
0:00:36 He’s in tight with Meta, been featured on stage at Meta, works with OpenAI.
0:00:39 He’s given TED Talks, and he’s the author of the book Make a Seat.
0:00:43 He’s done some amazing stuff in the AI world.
0:00:45 He’s basically an AI wizard.
0:00:51 And today he’s going to chat with us and break down some of his AI wizard spells on this show.
0:00:52 We’re going to talk AI agents.
0:00:54 We’re going to talk AI workflows.
0:00:56 We’re going to break it all down.
0:01:02 Let’s just go ahead and jump right in with Don Allen Stevenson.
0:01:06 When all your marketing team does is put out fires, they burn out.
0:01:10 But with HubSpot, they can achieve their best results without the stress.
0:01:16 Tap into HubSpot’s collection of AI tools, Breeze, to pinpoint leads, capture attention,
0:01:19 and access all your data in one place.
0:01:22 Keep your marketers cool and your campaign results hotter than ever.
0:01:31 Visit hubspot.com/marketers to learn more.
0:01:34 So really excited to dig in and thanks so much for hanging out with us today.
0:01:39 You’re out in San Francisco at the Masters of Scale summit.
0:01:41 So yeah, this is the first.
0:01:45 This is the first time we’ve chatted with somebody while they’re hanging out outside
0:01:47 of a conference.
0:01:50 First time we’ve had a conversation with somebody hanging out outside.
0:01:54 So kind of a different fun feel to it, but how are you doing?
0:01:56 Oh, I’m doing great.
0:01:57 Yeah.
0:01:58 It’s a really fun atmosphere right now.
0:02:02 They have a lot of people in different tech spaces coming together.
0:02:06 Reid Hoffman hosts this every couple of years to kind of bring a lot of minds together and
0:02:07 talk about stuff.
0:02:11 And so, yeah, I’m here to take notes and observe and report stuff online.
0:02:12 Amazing.
0:02:13 Amazing.
0:02:18 Well, we’re going to get into a lot of the sort of current happenings in the AI world.
0:02:23 The day that we’re recording this has been one of the craziest weeks in the world of AI.
0:02:29 Anthropic released autonomous agent functionality, Midjourney released some stuff, Runway released
0:02:34 some stuff, Ideogram, all of these companies dropped some really, really cool
0:02:35 new features.
0:02:36 We’re going to get into all of that.
0:02:40 But I think before we do, let’s let’s kind of get to know you a little bit.
0:02:44 I know your background is in visual effects at DreamWorks.
0:02:46 Would you mind breaking it down real quick?
0:02:50 You know, what were you doing before this, and how did you get to doing what you’re doing
0:02:51 right now?
0:02:52 Yeah.
0:02:57 So before I was doing all the independent content creation stuff that I do now, I was
0:02:59 a teacher, a specialist trainer at DreamWorks.
0:03:02 So I taught all of our software, worked on How to Train Your Dragon and Boss Baby and
0:03:03 Trolls.
0:03:08 And my job was to teach our artists how to leverage both the proprietary and the
0:03:11 third party software that we used to make the films.
0:03:12 I loved it.
0:03:18 I decided to resign because I wasn’t too thrilled with some of the directions, some of the technology
0:03:19 was going internally.
0:03:25 And I thought it might be easier to innovate and future proof outside of the studio.
0:03:32 I still love my friends and family at DreamWorks, but I found that it was easier to do some innovation
0:03:33 stuff outside of that.
0:03:38 And yeah, now I do a lot of content creation, writing books, using AI. I taught and trained
0:03:44 an AI to interview me and then had that help me write some books, do a masterclass on creativity
0:03:46 and AI and then consulting.
0:03:47 I don’t know.
0:03:48 It just kind of changes every day.
0:03:49 It just depends on who’s asking.
0:03:50 I’ll just change the hat.
0:03:56 So some of my favorite stuff that I’ve seen from you on Instagram is when you kind
0:03:58 of get like theoretical, right?
0:04:01 You start using your phone and going, this is what could be, right?
0:04:05 Like you’ll be walking around in a grocery store and maybe you pick up a box of cereal
0:04:09 and point your phone, or looking at it with your glasses or something, right?
0:04:12 And it’ll start pulling up all this information and stuff flashing on the screen.
0:04:17 And from what we’ve seen in the world of AI, it looks like stuff that we might have right
0:04:21 now, but it’s all sort of conceptual and theoretical and you’re playing with it.
0:04:24 I just love those videos that you make on Instagram.
0:04:28 I’m curious, where like, where does, where does that like inspiration come from?
0:04:32 Like where do all these like really cool sort of futuristic ideas that you shoot videos around
0:04:33 come from?
0:04:34 Yeah, believe it or not.
0:04:40 And a lot of that inspiration comes out of a response to the show Black Mirror.
0:04:45 I was watching the show and I, you know, I like it, but at the same time, I’m just noticing
0:04:50 that it’s been giving people very little hope for the future and I’m kind of tired of that.
0:04:56 So I started that series called Clear Mirror to do the opposite of Black Mirror.
0:04:59 And instead of calling it white, I call it Clear Mirror because I want to have a more
0:05:02 transparent relationship with technology.
0:05:07 It’s not just a dark opaque system that you can’t look into or assess or observe.
0:05:11 And so yeah, I really just wanted to have more positive use cases, more positive stories
0:05:12 to tell.
0:05:15 And the result is these fun little videos.
0:05:20 I often shoot them with these glasses and then add the effects onto them afterwards.
0:05:23 Cause I, to your same point, I think we’re going to have a lot of that stuff.
0:05:27 Like now, you know, they’re not that far off; they used to be so far in the
0:05:28 future.
0:05:30 People are like, oh yeah, that’s pretty normal.
0:05:33 I’m like, wow, damn, it’s zipping by.
0:05:34 Yeah.
0:05:35 Yeah.
0:05:38 Sometimes I’ll just scroll by the video and I’ll just see it on, on mute, right?
0:05:39 Like I’ll just be scrolling Instagram.
0:05:40 I’ll see it on mute.
0:05:43 And at first, these days, I’m usually thinking, oh, that’s a real thing.
0:05:44 That’s, that obviously exists.
0:05:48 And then I’ll watch the video and go, Oh no, this is one of those like concepts that he’s
0:05:49 playing with.
0:05:55 But I mean, it feels like the two are really getting closer and closer together, right?
0:05:59 Those like theoretical sci-fi things are starting to feel more and more real.
0:06:03 And it’s hard, harder and harder to tell, like, is, is that actually something we can
0:06:04 do?
0:06:08 Because I mean, looking at those Orion glasses that we got to see, you know, a few weeks
0:06:14 ago, a lot of this stuff you were showing off, like those glasses do, like it’s pretty
0:06:15 crazy.
0:06:16 100%.
0:06:17 Yeah.
0:06:21 I mean, like on the Orions, like, you know, they got the form factor down, and they
0:06:22 track well.
0:06:28 And you can have, you can talk to AI characters, you can talk to in-person avatars that are
0:06:31 both either a stylized version or a photorealistic version.
0:06:34 So it’s like, oh, that, that exists now.
0:06:36 It’s not, that’s not sci-fi.
0:06:37 That’s a real thing.
0:06:41 Whether it’s, you know, ready for consumer adoption still, still to come.
0:06:44 But at the same time, it’s not, it’s not fantasy anymore.
0:06:45 Yeah.
0:06:46 Yeah.
0:06:48 So anybody that’s listening to the audio and might not be watching the video, you’re
0:06:51 wearing the Meta Ray-Ban sunglasses.
0:06:53 We’ve talked about them a lot on this, on this show.
0:06:57 I’m, I’m a fan of them, but you mentioned that you managed to like hack them and run
0:06:58 ChatGPT.
0:06:59 I’m curious.
0:07:02 Is that like, is that a really hard process to do?
0:07:03 Is that something you could share?
0:07:04 Like, how does that actually work?
0:07:05 Yeah.
0:07:07 So I, full disclosure, I told Meta about this.
0:07:10 So they know about that, and I was like, hey, what are you going to do about it?
0:07:13 And they actually thought it was kind of cool, but, you know,
0:07:16 they’re like, why not just use their AI?
0:07:18 But yeah, in general, it is pretty friendly to do.
0:07:20 I can’t code at all.
0:07:24 So how I did it was, I’ve had it for about a year, by the way.
0:07:29 I just asked ChatGPT how to do it.
0:07:35 And ChatGPT-4, not even 4o, not even the reasoning one, not OpenAI’s o1;
0:07:38 it was the old one, ChatGPT-4.
0:07:45 And what I asked it was like, hey, I have an API key to, you know, ChatGPT with voice.
0:07:50 And I have the Apple Siri Shortcuts app built into my iPhone.
0:07:54 Is there a way that you can give me step by step instructions that would allow me, someone
0:08:01 who cannot code, to understand how to plug and play in Apple Siri shortcuts to get a
0:08:05 thing where if I hit the action button on my Apple Watch, or if I hit the action button
0:08:13 on the iPhone 15 or 16, it triggers voice back-and-forth conversational mode with ChatGPT,
0:08:16 but on the Ray-Ban glasses. I would say it’s not too hard to set up.
0:08:19 But maybe it’s a few weird steps.
0:08:21 It’s kind of like built in now.
0:08:24 So I think now you probably don’t even have to do the whole shortcuts thing.
0:08:31 I think ChatGPT recently released a widget that does that, and I was like, oh my God.
0:08:35 But before the widget, you could do it with the action button and then, yeah, triggers
0:08:39 the voice mode and the main use case was just more conversational, you know, you can talk
0:08:42 back and forth and have a, and you can interrupt it.
0:08:45 And it was when it had the model that sounded kind of like Scarlett Johansson is when I was
0:08:46 using it.
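
For anyone curious what that kind of Shortcut automates under the hood, here is a rough sketch of the same round trip in plain Python using the standard OpenAI SDK. The file name, model choices, and prompt are placeholders, and the real setup Don describes lives entirely in Apple Shortcuts, which handles the recording and plays the reply back through the glasses’ speakers.

```python
# Rough sketch of the voice round trip a Siri Shortcut can automate:
# record a question, transcribe it, send it to the model, hand back text to speak.
# Assumes OPENAI_API_KEY is set; the audio file name and models are placeholders.
from openai import OpenAI

client = OpenAI()

# 1. Transcribe the audio the Shortcut captured through the glasses' microphones.
with open("question.m4a", "rb") as audio:
    question = client.audio.transcriptions.create(model="whisper-1", file=audio).text

# 2. Send the transcribed question to the chat model.
reply = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a concise voice assistant."},
        {"role": "user", "content": question},
    ],
)

# 3. Return the text to the Shortcut, which speaks it aloud on the glasses.
print(reply.choices[0].message.content)
```
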
0:08:47 Right.
0:08:48 I loved having that.
0:08:49 Wow.
0:08:50 Her is real.
0:08:53 I have Samantha on the glasses.
0:08:58 And then they, you know, canceled the sound alike person and then now I lost that voice.
0:09:02 But hey, it was great while it lasted.
0:09:03 Which voice are you using now?
0:09:04 The British sounding one.
0:09:05 I thought it was kind of fun.
0:09:07 I don’t know the character’s name.
0:09:08 I went school.
0:09:09 Yeah.
0:09:10 I know what you’re talking about.
0:09:11 That’s awesome.
0:09:12 I appreciate you sharing that with us.
0:09:16 I mean, it’s crazy what these tools can do, like Claude and ChatGPT, where you can just,
0:09:22 like, I built a whole video game using ChatGPT once just going back and forth saying,
0:09:24 um, this is what I’m trying to make.
0:09:27 And then when there was an error, I would just go back and say, I don’t know how to
0:09:28 code.
0:09:29 So what’s this error and how do I fix it?
0:09:30 They would tell me how to fix it.
0:09:36 And then, I mean, it took me several hours to get to where I wanted to go.
0:09:40 But still several hours to develop a game is a lot better than the old fashioned way.
0:09:41 100%.
0:09:43 And do you have a coding background?
0:09:44 I don’t.
0:09:47 I mean, I know how to do HTML and build websites, but that’s about the
0:09:48 extent of it.
0:09:49 Yeah.
0:09:50 I mean, that says everything right there.
0:09:53 It’s like, you can have a game in a couple hours with today’s tools.
0:09:57 That was not a normal thing, even like six months ago.
0:10:01 Well, you know, speaking of, of Claude, they just rolled out a new feature.
0:10:05 And now I just want to kind of get into like all of these new stories that came out.
0:10:08 We can all just sort of like riff on our thoughts and where this is all headed.
0:10:13 But you know, as of this recording, Claude just released a brand new feature called,
0:10:18 I think they call it computer use, which is not my favorite, like naming convention for
0:10:19 it.
0:10:22 But like, Anthropic just released computer use, right?
0:10:26 And I actually went and tested it yesterday and it was pretty cool.
0:10:27 It ran into some issues.
0:10:30 I ran into like rate limit issues where it would go through a whole bunch of steps and
0:10:35 then it would say, oh, you’ve run into a rate limit and it wouldn’t continue for me.
0:10:39 And I ran into some stuff like that, but it was really, really interesting to actually
0:10:45 watch it go and like open up Firefox for you, go move the mouse to the command bar, type
0:10:47 in a search for you.
0:10:51 And then once it searches, basically I gave it the prompt: go to Matt Wolfe’s YouTube
0:10:56 channel, find the top five most popular videos, tell me how long ago they were all published,
0:10:59 and then add them to a spreadsheet for me.
0:11:00 Wow.
0:11:04 And it actually managed to go through every single one of the steps, go to my YouTube,
0:11:09 click on popular, sort by popular, grab the title, copy, paste it into a spreadsheet and
0:11:13 it filled out a spreadsheet of the top five most popular videos.
0:11:17 Now if I’m being honest, I could have done that process myself, you know, four times
0:11:22 as fast, but the implications I think are really, really cool that I could just give
0:11:23 it a command.
0:11:24 This is what I need you to do.
0:11:28 Now go off and do it and I could walk away from my computer and it’ll just go through
0:11:31 all the steps until it completes the thing.
0:11:33 But yeah, pretty crazy.
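
For anyone who wants to try that same experiment, here is a minimal sketch of what a single request to Anthropic’s computer use beta looked like around the time of this recording. The model name, tool type, beta flag, and display size are taken from the beta docs from memory and should be treated as placeholders; a real setup also needs a loop that executes each requested action (screenshots, clicks, typing) and feeds the results back until the task finishes.

```python
# Minimal sketch of one turn against Anthropic's computer use beta.
# Model name, tool version, and beta flag reflect the late-2024 docs and may change.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.beta.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[{
        "type": "computer_20241022",
        "name": "computer",
        "display_width_px": 1280,
        "display_height_px": 800,
    }],
    messages=[{
        "role": "user",
        "content": "Go to Matt Wolfe's YouTube channel, find the five most "
                   "popular videos, and put their titles in a spreadsheet.",
    }],
    betas=["computer-use-2024-10-22"],
)

# Claude replies with tool_use blocks describing mouse and keyboard actions;
# a real agent loop executes them and returns tool_result blocks until it's done.
for block in response.content:
    if block.type == "tool_use":
        print(block.input)  # e.g. {"action": "screenshot"} or a click with coordinates
```
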
0:11:35 Have you played with it at all yet?
0:11:36 I actually haven’t.
0:11:37 There’s been so much stuff.
0:11:40 I’ve been reviewing it and watching videos on it.
0:11:43 But I mean, I think by this time next year, it’s not going to be about like the single
0:11:45 AI agent that you’re using.
0:11:49 It’s going to be like, what is your team of AIs that you manage and you’re the CEO of?
0:11:50 What is your team doing?
0:11:55 And you’re going to be like, oh, I’m Matt and my team does like all these things.
0:11:57 You’ll have like all these agentic properties.
0:11:58 I’m excited for that.
0:12:02 It’s going to get weird though because we’ll be like, wow, remember when it was so slow
0:12:05 way back in 2024, things were so slow.
0:12:07 All pre-agent days.
0:12:08 Yeah.
0:12:09 You had to do work on your computer.
0:12:12 You actually had to use the computer to do work.
0:12:13 Yeah.
0:12:17 I know this might seem like an obscure reference, but you know, WALL-E, did you watch the WALL-E
0:12:18 movie?
0:12:21 Do you remember the captain in the movie WALL-E?
0:12:22 Yeah.
0:12:23 Yeah.
0:12:24 Yes.
0:12:25 Yes.
0:12:29 That gentleman, his main issue was that he was like the only person on earth of the
0:12:34 humans that could still read, and he had the reading level of maybe like a fifth grader or a four
0:12:41 or five year old, and it just makes me think that maybe down the road, in a more negative light,
0:12:46 we might be returning to that, where we lose some of our, you know, if we
0:12:51 have so many AI agents doing every single task, we might forget some of that basic stuff.
0:12:55 Like, oh, remember, he tries to open up a book by voice commanding
0:12:56 it.
0:12:57 Yeah.
0:12:58 He’s like, open.
0:13:00 And then he’s like, look at it.
0:13:01 That’s a black mirror.
0:13:02 Right.
0:13:03 That’s a black mirror.
0:13:04 Yeah.
0:13:05 Stay on the clear mirror.
0:13:06 You’re right.
0:13:07 Sorry.
0:13:08 My apologies.
0:13:09 Yeah.
0:13:10 Yeah.
0:13:11 I mean, I feel like this technology too, though, could like be teaching everyone to
0:13:12 read better and things like that.
0:13:16 Like, you know, spending less time on the computer and more time in the real world, the physical
0:13:17 world as well.
0:13:18 So.
0:13:19 Yeah.
0:13:20 I feel like there’s like almost two potential paths, right?
0:13:25 There’s going to be people who use AI and they get way lazier as a result, right?
0:13:27 They’re like, oh, this does all my work for me.
0:13:29 I’m just going to let it do everything.
0:13:32 I’m going to go and smoke weed and play video games, right?
0:13:37 Like there’s going to be that sort of sector of people, but then the way, like ever since
0:13:43 AI has sort of bubbled up over the last three years, for me, it’s made me go way deeper
0:13:44 on stuff, right?
0:13:46 Like it’s made me go, oh, this is really interesting.
0:13:47 I want to dive deeper.
0:13:52 I’m going to go to perplexity and have perplexity, dig into all these resources for me and do
0:13:54 some research and I’m going to learn more about it.
0:13:58 I’m going to go to archive.org and some of these like white papers that these people
0:14:03 put out that I tried to read in the past, but like once it starts putting like letters
0:14:07 into the algorithms and different symbols that I don’t even recognize, these papers
0:14:11 are over my head, but I can now pull them into notebook LM and have notebook LM give
0:14:13 me a podcast that explains it to me.
0:14:14 Perfect.
0:14:17 So I feel like there’s those two potential paths: I’m going to get way
0:14:25 lazier as a result, or I’m going to use this to really up-level, you know.
0:14:28 We’ll be right back, but first I want to tell you about another great podcast you’re
0:14:29 going to want to listen to.
0:14:34 It’s called Science of Scaling hosted by Mark Roberge, and it’s brought to you by the
0:14:39 HubSpot Podcast Network, the audio destination for business professionals.
0:14:44 Each week hosts Mark Roberge, founding chief revenue officer at HubSpot, senior lecturer
0:14:49 at Harvard Business School, and co-founder of Stage 2 Capital, sits down with the most
0:14:54 successful sales leaders in tech to learn the secrets, strategies, and tactics to scaling
0:14:56 your company’s growth.
0:15:01 He recently did a great episode called How Do You Solve for a Siloed Marketing and Sales,
0:15:03 and I personally learned a lot from it.
0:15:05 You’re going to want to check out the podcast.
0:15:09 Listen to Science of Scaling wherever you get your podcasts.
0:15:15 Thank you for that; you kind of reminded me of the Clear Mirror.
0:15:16 You’re right.
0:15:19 Like, I will not go down that route here.
0:15:20 It’s very positive.
0:15:24 You know, you can have, like, I would love to have an AI agent that I build, like one
0:15:29 the person I’m going to try to do with computer use, is get a little AI to be my professional
0:15:34 critic and email me how it feels about my content.
0:15:38 I’ve been training a critic based off of all the best critical feedback I’ve gotten
0:15:39 online over the years.
0:15:43 I’ve been collecting it anyway on Google Doc, and they’re all feedback, so it’s not
0:15:48 like haters, but like comments that hurt me because they were right.
0:15:49 Yeah.
0:15:50 Yeah.
0:15:52 And I was like, oh, dang.
0:15:57 So, yeah, like, basically, I would love to have Claude computer use periodically
0:16:03 check my content and then inform me on things like, “Hey, you’re not this about this.”
0:16:07 You know, just get like that nice formal critique where it’s like a thoughtful, actionable thing.
0:16:08 Yeah.
0:16:12 When it comes to these AI agents, you know, I want to go back to something you said where
0:16:17 you’re almost like the CEO and you have like a bunch of agents underneath you, and I love
0:16:20 that idea of like from a YouTuber perspective, right?
0:16:24 Like I would love to be able to use one of these agents and say, all right, here’s the
0:16:27 transcript from a video I’m about to publish.
0:16:32 Take this transcript and, you know, write a title for me, but to write a title, go do
0:16:36 some research on YouTube and find out what style of title is working really, really well
0:16:39 right now based on what you find.
0:16:41 Give me 10 potential titles.
0:16:46 That’s AI agent number one, AI agent number two, I need a good thumbnail for this video.
0:16:50 Go look on YouTube, find the thumbnails that are performing the best, take some screenshots
0:16:55 of them and analyze what works really well for thumbnails right now.
0:16:59 Come back, give me some ideas for a thumbnail, give me like 10 ideas.
0:17:00 I’ll pick from those 10.
0:17:04 And then once I pick one, you go and make that thumbnail for me, right?
0:17:08 And now it’s just like, I made the video, I give it the transcript, and it, you know,
0:17:13 like I’m the producer role and all the little roles under me know exactly what they’re supposed
0:17:16 to go do to complete the rest of the process.
0:17:21 And that to me is like such an exciting world because it’s like, it’s not really taking
0:17:25 away a lot of the creative tasks that I enjoy doing, it’s taking away more of the monotonous
0:17:27 tasks that I don’t enjoy doing, you know?
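
As a toy illustration of that producer-and-agents idea, here is what chaining two such “agents” might look like with a plain LLM API. The prompts, role names, and model are placeholders, not a real product, and the research steps Matt describes (checking what titles and thumbnails are performing on YouTube, grabbing screenshots) would each need browsing or computer-use tools layered on top of this.

```python
# Toy sketch of a "producer" chaining two specialist agents over one transcript.
# Prompts and model name are illustrative placeholders, not an existing workflow.
import anthropic

client = anthropic.Anthropic()

def run_agent(system_prompt: str, task: str) -> str:
    """One 'agent' here is just a system prompt plus a task."""
    msg = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        system=system_prompt,
        messages=[{"role": "user", "content": task}],
    )
    return msg.content[0].text

transcript = open("video_transcript.txt").read()

# Agent 1: title writer.
titles = run_agent(
    "You write YouTube titles in styles that are currently performing well.",
    f"Suggest 10 titles for this video transcript:\n\n{transcript}",
)

# Agent 2: thumbnail ideation, fed the titles from agent 1.
thumbnails = run_agent(
    "You propose YouTube thumbnail concepts based on what performs well today.",
    f"Give 10 thumbnail ideas for a video with these candidate titles:\n\n{titles}",
)

print(titles)
print(thumbnails)
```
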
0:17:31 I feel like this world has been coming for a while and it’s refreshing.
0:17:36 Like any of you Harry Potter fans or watch any of the Harry Potter movies?
0:17:37 Oh yeah.
0:17:42 So do you remember in the last movie when Dumbledore walks up to all the paintings and gives them
0:17:46 a series of complicated tasks to secure the castle?
0:17:51 And then the character in the painting left the frame, went into the wall and basically
0:17:57 did autonomous things around the castle, like setting up shields, notifying the right people.
0:17:59 That’s what I think we’re going to be all having.
0:18:04 Like that fantasy, that magic of talk to the painting, the painting will go do something
0:18:09 on your behalf, that’s like here now and I’m excited for everyone to have magic.
0:18:10 Yeah.
0:18:12 That’s such a good analogy.
0:18:13 My wife’s a huge Harry Potter fan.
0:18:14 I’ve seen them all.
0:18:17 I wouldn’t say that I’m like the biggest Harry Potter fan, but I’ve seen them all because
0:18:20 my wife and kids all watch them.
0:18:24 So I’ve seen them all at least three times, but no, that’s a great little analogy, a great
0:18:27 picture of what’s happening here.
0:18:31 And I mean, to me, that’s a super exciting world and I think it’s going to get to a point
0:18:38 too where you have the agent that is also like the CEO or producer who is like going
0:18:41 and telling each of these other agents to go do their role, right?
0:18:44 I think you’re going to have like multiple levels, right?
0:18:49 Where you just train your CEO and then the CEO agent goes and tells the other agents
0:18:50 what to do.
0:18:51 Yeah.
0:18:53 It’s like sci-fi and fantasy have like combined now.
0:18:56 Like these were things that were really sci-fi.
0:19:00 Like remember that magical feeling of like seeing a painting talk and interact and like
0:19:05 the fact that it could hear you, it could hear the new characters, the fictional characters.
0:19:06 That’s like, we have that now.
0:19:11 And if we add, you know, some of the other tools I know we’ll talk about, like AI Studio,
0:19:16 or sorry, not AI Studio, Act-One, you can almost build that actual painting character and it’s
0:19:18 actually, it would look that way.
0:19:20 So we can take it the whole mile.
0:19:21 Yeah.
0:19:22 Yeah.
0:19:23 Yeah.
0:19:24 It’s kind of shocking to me like how fast humans adjust to all of this.
0:19:26 Like this stuff is purely magic.
0:19:30 Like if you really think about, if you step back and look at it, it is magical and people
0:19:31 just get used to it.
0:19:33 Like after a day, it’s like, yeah, of course it does that.
0:19:34 Whoa.
0:19:35 Right.
0:19:36 What do you know?
0:19:37 It’s like, what?
0:19:38 Yeah.
0:19:39 It’s funny.
0:19:40 It’s funny you say that.
0:19:41 Cause I recently, I mean, I actually read it several years ago, but I recently re-read
0:19:42 it.
0:19:44 There’s an obscure book called “Off to Be the Wizard”.
0:19:50 I don’t know if any of you have ever heard of it, but it’s about this kid who, he opens
0:19:56 up his computer one day and he finds this like mysterious file on his computer and he
0:20:01 starts looking through this file and realizes that he can like tweak things and it actually
0:20:03 tweak things in real life.
0:20:04 Right.
0:20:08 So he would like, he found like his bank account and the number that was in his bank account
0:20:11 and he added an extra zero and then he logged into his bank account and there was an extra
0:20:12 zero there.
0:20:13 Right.
0:20:17 And so he figured out how to like tweak the real world by tweaking the code.
0:20:21 But then what he, but then like the, the feds caught onto him and said, this guy’s obviously
0:20:22 doing something illegal.
0:20:23 Right.
0:20:24 How does he have all this money?
0:20:27 Like, I don’t know how he did it, but there’s something weird going on.
0:20:30 And so he decides, I’m going to travel back in time.
0:20:35 So he goes back in time to the medieval age, but he goes back with his iPhone and all
0:20:40 of his, his computer devices and all that kind of stuff and he convinces everybody in
0:20:46 medieval times that he’s a wizard, but all he’s really doing is using today’s technology,
0:20:47 he has it back in medieval time.
0:20:49 And that’s the whole premise of the book.
0:20:54 And it’s amazing book, such a like a fun, fun read, but it’s like, that’s what, what
0:20:55 we’re seeing right now.
0:21:02 It’s like, if you went back a hundred years ago and started having a conversation with
0:21:04 them, people would think you were a freaking wizard.
0:21:05 Sorcery.
0:21:06 Yeah.
0:21:09 Like what power that they have.
0:21:15 Um, I feel like we’ve been using technology to kind of anthropomorphize and create almost
0:21:16 every Greek God.
0:21:18 Do you remember Hermes?
0:21:25 He was the messenger God and his magical power was like sending a message instantly across
0:21:26 vast distances.
0:21:32 Now it’s like, you get a phone call and you’re like, I don’t want to answer it.
0:21:37 That was a God-like power in earlier, you know, belief systems.
0:21:41 Like I just think that’s so interesting to your point that you made, you know, Nathan
0:21:44 around how quickly we get used to things.
0:21:47 That was, that was the land of gods and goddesses.
0:21:50 And then now it’s like the climbable.
0:21:51 Yeah.
0:21:52 Yeah.
0:21:57 And one thing I see coming too is like, these sort of multi-step tool use things are
0:22:01 probably coming to like the Meta Ray-Bans pretty soon as well, right?
0:22:02 Like it’s probably only a matter of time.
0:22:05 I mean, you probably even have more insight than we do on this, but it’s probably only
0:22:09 a matter of time where you just like look at something with your glasses and then say,
0:22:10 oh, that’s a cool backpack.
0:22:13 Go buy that for me on Amazon, right?
0:22:17 And then like, you know, you get an email confirming that this was purchased on Amazon
0:22:19 and it’s sitting on your door the next day, right?
0:22:23 Like combine the glasses with the ability to go do the shopping for you.
0:22:25 And it’s like anything you see in the real world.
0:22:26 I want that.
0:22:27 Okay.
0:22:28 I’ll go get it for you.
0:22:29 Right.
0:22:30 Like that’s coming as well.
0:22:33 I think I’ve even seen, well, I haven’t seen a live example of this, but it’s a theory
0:22:40 I have that a lot of the next generation of ads are going to be for AIs to see.
0:22:46 And then that AI knows you’ve given it some allowance, some permissions, even like an
0:22:52 allowance like, hey, you’re allowed to spend up to this amount per month on what you know
0:22:54 about me, what I’ve approved.
0:23:00 And then the ad that that AI is going to see is going to hit your AI that’s trained on
0:23:03 your preferences and it will go make that purchase.
0:23:05 And then it just seems like it magically appears on your doorstep.
0:23:06 Yep.
0:23:07 Yep.
0:23:08 I could see that.
0:23:10 Well, let’s talk about some of this other really cool stuff because you obviously have
0:23:14 a background in, in like visual effects, dream works and stuff like that.
0:23:18 And we just got to see this runway take one, which if you’re listening and you haven’t
0:23:23 heard of runway take one, it’s this new feature, I believe they’re rolling it out.
0:23:24 I don’t have it in my account yet.
0:23:28 I believe it’s like rolling out fairly soon though, but it’s a feature where you can take
0:23:36 a video and sort of record yourself on video, feed it to this runway act one and then it
0:23:40 will make like a cartoon animation and it will be lipstick to you and it will follow
0:23:42 you the same emotion.
0:23:46 So like if you look happy, if you’re laughing, if you’re sad, if you’re angry, those emotions
0:23:51 theoretically will show up on the face of this cartoon character based on the emotions
0:23:54 that you had in your original video.
0:23:58 And I’m just curious, your thoughts because I know, you know, coming from a background
0:24:04 of like visual effects and Hollywood and all of that kind of stuff, I know there’s like
0:24:06 fear around a lot of these tools too.
0:24:08 So like what are your initial thoughts on it?
0:24:13 The people that you do know that are working in video and Hollywood, like what is the general
0:24:14 consensus?
0:24:15 Are people excited about stuff like this?
0:24:17 Are people really fearful about stuff like this?
0:24:19 Like where do your thoughts lie with this kind of thing?
0:24:23 Yeah, so it really comes down to two words, generalists versus specialists.
0:24:25 My generalist friends are thrilled.
0:24:27 They’re like, “This is so exciting.
0:24:28 This is all great.”
0:24:32 Like they’re already generalists and this is just another tool that they can add that
0:24:34 will enhance what they’re doing.
0:24:41 My specialist friends, specifically the ones I trained, aren’t thrilled because their specialist
0:24:44 skills were like facial rigging.
0:24:49 That’s a whole job at DreamWorks and Pixar and Illumination Films, where someone rigs
0:24:54 a 3D puppet to allow it to emote and to create all those faces.
0:24:59 So like if you’re a specialist in the animation and film world right now, you’re not thrilled
0:25:00 with a lot of these AI developments.
0:25:03 In fact, you’re scared and angry.
0:25:06 But if you’re a generalist, you’re like, “Oh my goodness.”
0:25:10 And then another framework is size of studio.
0:25:16 So the larger studios are more specialist and the smaller studios have more generalists
0:25:17 as artists.
0:25:24 So you’re going to see smaller studios, smaller animation films, teams, companies, small studios
0:25:28 are going to adopt a lot of these tools because they already are wearing all the hats.
0:25:33 The larger studios are not going to want to adopt these tools because they have mostly
0:25:34 specialists.
0:25:38 So they don’t want to scare all their talent away.
0:25:40 They will replace their talent when they can.
0:25:41 Yeah.
0:25:46 So, I mean, being somebody that you’ve kind of had your foot in both worlds like the video
0:25:51 production world as well as the AI world, like do you have advice for the people that
0:25:52 are scared?
0:25:55 Like what kind of stuff do you tell the ones that are sort of freaking out, maybe more
0:25:56 of the specialists?
0:25:57 Yeah.
0:26:02 Right now the main advice I give to like my fellow specialist friends that have deep skills
0:26:06 is they have to start actually using these tools.
0:26:11 They can’t just order anymore unless they only want their skill to be a hobby.
0:26:15 If they just want it to be a hobby, then don’t change a thing.
0:26:21 But if their idea is you want to also have a career in this thing, you cannot roll your
0:26:24 eyes at the latest AI update anymore.
0:26:28 That’s what’s been my advice.
0:26:34 The other one is they could double down or triple down on purely human-generated content.
0:26:39 They can go like the craftsman, artisanal, Etsy.
0:26:43 I also recommend that to some artists that are really anti the AI stuff.
0:26:47 If they go down that route, they have to basically increase the prices dramatically.
0:26:49 It has to be a luxury product.
0:26:53 If their idea is like, “Oh, I want to make my artwork and my film really accessible to
0:26:57 lots of people and I’m going to make it from scratch by hand,” the only people that
0:27:03 I believe are going to be able to afford that are high-end luxury clients.
0:27:08 So it’s like if they pick up a general audience, they might have to use AI tools now and they
0:27:10 can just be like, “Oh, no.”
0:27:11 Yeah.
0:27:15 From what you’ve seen, how close do you think we are to these tools being able to completely
0:27:17 replace riggers and things like that?
0:27:24 It depends because I used to think that there was a threshold of quality that was required
0:27:25 to be met.
0:27:30 I don’t believe in that anymore because I’ve learned that quality content is relative.
0:27:39 There’s some people who find that Skippity Poop need or animation, there’s a certain
0:27:44 audience that actually finds that to be quality animation and quality content.
0:27:46 The rigs on those things are terrible.
0:27:51 The rigs, the way that the mouth blends and the rig is terrible.
0:27:55 If you showed that to an artist at Pixar or Disney or DreamWorks, they would say, “That’s
0:27:56 a terrible rig.
0:27:58 No one’s going to watch that,” but then they do.
0:28:03 So I think you can use a lot of these tools today, like a creative person can use these
0:28:09 tools to tell stories and provide value today because I think value is subjective.
0:28:15 I get a lot of anger online for saying that. Some of my more film and specialist friends
0:28:21 want to say that value is not relative, that there’s a difference in quality of content.
0:28:24 I kind of disagree with that, unfortunately.
0:28:25 Yeah.
0:28:26 I can see that.
0:28:32 I mean, I look back like 20 years ago and the videos that were going viral 20 years ago
0:28:39 on the internet were silly stick figure videos of stick figure death videos of stick figures
0:28:44 falling off cliffs and then sort of splatting and had sort of cartoony blood and stuff like
0:28:45 that.
0:28:49 But they were the most basic looking animations and those were the videos that everybody was
0:28:52 looking at that were going viral because they were humorous.
0:28:58 They had some sort of story and there was comedy and the sort of visual aspect of it
0:29:01 was sort of the last priority of it.
0:29:05 But people loved that stuff and they would go viral and there’s examples of stuff like
0:29:06 that today.
0:29:09 You can find YouTube channels that are, I mean, South Park, right?
0:29:11 Look at South Park.
0:29:16 It’s like fairly crude looking construction paper animation and it’s still one of the
0:29:18 most popular comedy shows today.
0:29:19 That’s an ideal example.
0:29:20 Yeah.
0:29:24 I mean, just to add to that same point, there’s a type of animation that we do at a lot of
0:29:29 animation studios called an animatic, and for those that aren’t familiar, it’s a really
0:29:34 rough pass of an animation that gets its story down, but the actual display and the actual
0:29:41 audio that you’re hearing is usually temporary or in flux or being changed out a lot.
0:29:46 And my proof that I hope more big studios actually listen to is that we would do audience
0:29:52 testing for films with just the animatics where basically no shot is done and they’re
0:29:53 all at different stages.
0:29:58 Some stuff is fully rendered in 3D, some stuff is just hand sketched, some stuff is basically
0:30:04 an early comic, but when you see the audience reacting to it, they love the story, they’re
0:30:06 laughing at the right times, they’re responding.
0:30:14 I wish more DreamWorks artists, Disney artists in film, TV in general, film in general, acknowledged
0:30:18 an animatic can actually work on its own.
0:30:24 There’s certain situations where you might not need to spend $300 million on rendering
0:30:28 to tell that same story that the audience loved.
0:30:32 They loved the story, especially when there’s a lot of talent behind it and that was very
0:30:33 eye-opening.
0:30:36 We had a whole, we bought out a big elementary school once to do an early screening of How
0:30:41 to Train Your Dragon 3, and it was only in animatic form, and the kids loved it.
0:30:47 They were laughing and cracking up and the shots would be stick figures at times, rough
0:30:52 sketches and then other times it’d be a fully 3D rendered model and I’m sitting there thinking
0:30:56 like, “Don’t they know that it’s not done, the film’s not done yet?
0:30:58 How do they enjoy it so much?”
0:31:00 It didn’t matter.
0:31:04 It feels like all of this really levels the playing field and I think it’s going to allow
0:31:08 some of the new kinds of films to be created because I spent a little bit of time in Hollywood.
0:31:12 I was partners with Barrie Osborne, the producer of Lord of the Rings and The Matrix.
0:31:16 We spent about a year and a half trying to create a movie studio together.
0:31:20 This crazy scenario where we had a friend who was an early crypto investor and he was
0:31:24 going to help fund things and pulled together capital and so I spent a year and a half going
0:31:28 out to New Zealand, going out to Hollywood, meeting all these people and he would take
0:31:33 me on like a Disney movie set and had all these different heads of departments tell
0:31:37 me what their job was and explain everything to me and answer any questions.
0:31:41 I was just, you know, it was shocking how much money all this cost.
0:31:46 To make a film now, when they greenlight a film, it’s almost always a sequel or something
0:31:47 like that, right?
0:31:53 Or based on some existing IP that people love because it costs $300 million to make, right?
0:31:57 You can’t take any creative risk with $300 million, right?
0:31:59 Or very little.
0:32:02 And so with this kind of technology, you’ll start getting the cost of films possibly down
0:32:07 to millions or tens of millions and you can take a lot more risk and you’ll see new kinds
0:32:10 of stories emerge that would have never happened before.
0:32:14 So what I’m like, you know, as a, as a consumer, very excited about this, see like actually
0:32:15 new stuff.
0:32:18 It’s not just a sequel of a sequel or based on some book that I never read or.
0:32:19 Yeah.
0:32:20 Yeah.
0:32:21 Well, it totally democratizes.
0:32:25 I know that’s a buzzword, but it totally democratizes like video creation, right?
0:32:29 Like there’s so many people out there that are probably amazing storytellers but have
0:32:32 like no skills in visual effects, right?
0:32:38 Well, now this stuff is allowing them to at least roughly tell their story in some way,
0:32:39 right?
0:32:41 And that’s, that’s what’s exciting as well, right?
0:32:46 Because I’ve never been very technical when it comes to a lot of like, like I don’t know
0:32:48 how to use After Effects very well.
0:32:55 I don’t know how to use any of the like blender or any of the like 3D animation tools or any
0:32:56 of that kind of stuff.
0:32:58 I think I can tell good stories.
0:33:01 I think I can make decent music.
0:33:06 And if I want to put video over it, I now have options to sort of do something with the video
0:33:12 portion of it, where I never felt like I had the skills to be able to do that before.
0:33:16 You know, are my videos going to look as good as somebody who’s been doing it in Hollywood
0:33:17 for the last 30 years?
0:33:22 No, but I can, I could get something out into the world that prior I wasn’t ever able to
0:33:23 get out into the world.
0:33:25 And to me, that’s what’s most exciting about it.
0:33:29 One thing, Don, one thing I wanted to ask, and I don’t know if this is stuff that you’re
0:33:30 allowed to share.
0:33:34 So if you can’t just don’t worry about it, we’ll skip past it, but you’re one of the
0:33:41 few people that I know that actually has been able to use Sora already, but I’m curious
0:33:47 if you’re able to share any of your experiences actually getting to use Sora.
0:33:48 Nothing I can share yet.
0:33:49 Yeah.
0:33:50 No worries.
0:33:51 Had to ask.
0:33:53 When I can, I will share a lot.
0:33:54 Very cool.
0:33:55 What about Metta?
0:33:59 So we saw you and I were both at the MetaConnect event this year.
0:34:04 We both got to see the Orion glasses and all of the stuff they’ve been working on.
0:34:09 One of the coolest things I saw on Instagram was your first-person view of you walking
0:34:12 out onto stage and shaking your hands with Mark Zuckerberg.
0:34:17 I’m like, there is literally no better use case for these Metta Ray-Ban glasses than
0:34:20 showing yourself walking out on stage in front of a huge audience and shaking hands with
0:34:21 Zuckerberg.
0:34:25 Literally, that is the ultimate use case for those glasses right now.
0:34:28 But I’m curious, how did that whole thing come about?
0:34:34 I know you’re one of the first people who’s used the AI.
0:34:38 What’s the feature called where it’s like an AI version of yourself that people can
0:34:39 go and chat with?
0:34:42 You’re one of the first people that’s gotten to use that.
0:34:44 How did that all come about?
0:34:45 Yeah.
0:34:47 Two stories there.
0:34:52 The product, that AI product is called Creator AI and it’s part of AI Studio within Meta.
0:34:57 They have the ability to allow creators to build a custom AI model that’s trained on
0:35:00 your social media data that you consent to.
0:35:04 I could check a list of what things it’s allowed to learn from and train from, and then it’s
0:35:06 added into its knowledge graph.
0:35:11 Kind of similar to building a custom GPT if that GPT updated on the fly based off of how
0:35:12 I use social media.
0:35:16 If I post a thread tomorrow, that will be part of its brain.
0:35:21 If I leave a really long comment replying to an answer to a common question that’s also
0:35:26 now added to its brain, it really speaks and communicates very much like me.
0:35:33 When it came to filming with the Ray-Bans and shaking Mark’s hand on stage, I actually
0:35:38 did not get permission to film that or to share that.
0:35:43 I decided I thought this would be a really appropriate thing, but I was nervous they
0:35:45 were going to say no if I asked.
0:35:51 I just was like, “I’m going to film this and I’m going to share it.”
0:35:54 I shared it right when I got onto a plane.
0:35:55 It was posted.
0:35:59 I couldn’t see any feedback for several hours and I was like, “I’m either going to get
0:36:04 a lot of anger when I land or a lot of joy,” and then there was a lot of joy.
0:36:06 They’re like, “This is a great use.”
0:36:13 I was like, “Yeah, the idea there was like, they talk about the advantages of that kind
0:36:15 of form factor.
0:36:19 I’ve been using their Rayman glasses I think since 2021 with their first versions that
0:36:23 came back in was called the Rayman Stories.
0:36:29 I like them because it’s a less distracted viewing experience, less distracted capturing
0:36:30 experience.
0:36:34 Believe it or not, the old versions of these glasses, when I proposed to my wife, I wore
0:36:43 the glasses while I proposed to her and got my first person view of her response saying,
0:36:44 “Yes, thank goodness.”
0:36:50 I mean, to be fair, it probably would have gone viral either way.
0:36:51 Right.
0:36:53 There’s certain moments where I’m like, “Okay, this makes a lot of sense.
0:36:56 It would have been really inappropriate for me to pull out my phone when I was proposing
0:37:01 to her and put the phone in front of us like, “What’s your answer to this?”
0:37:07 They were the transparent ones, so they weren’t the shades.
0:37:14 My eyes, we were looking at each other’s eyes and we could still capture that memory.
0:37:19 The reason why Meta reached out to me for those products is it’s those concept videos.
0:37:24 Believe it or not, I made those theoretical concept videos, I posted on Instagram of what
0:37:28 I want to see out in the world, and I try to do the opposite of the show Black Mirror.
0:37:34 A lot of people will try to tie their technologies to more dystopias, and then I’m actively not
0:37:35 doing that.
0:37:40 I’m trying to come up with creative positive uses, so they see that consistently and I’m
0:37:45 just doing it on my own that they’re like, “Wait, maybe if we support him, maybe he can
0:37:49 do this more,” and they have, so I’m like, “Oh, great.
0:37:53 I can tell more positive stories and I’ll use your tech to do it.”
0:37:54 Super cool.
0:37:57 Well, there’s one last gadget I want to ask you about because you mentioned it before
0:37:58 we hit record.
0:38:02 You’re wearing like the little blod thing, right?
0:38:06 Basically it’s a device that records all day long and keeps notes for you, right?
0:38:07 Is that what it is?
0:38:11 It’s a blod AI note pen, it has a magnetic back.
0:38:14 You can wear it like a lapel on your shirt, like a button.
0:38:18 It also comes with like a necklace, so you can wear it as a necklace, and it also has
0:38:23 like a wrist worn interface, so it looks like a watch and you can kind of wear it there.
0:38:24 And a clip.
0:38:28 They have a magnetic clip, so you can pin it onto a bag or something.
0:38:33 But yeah, what it does is it takes a listen and summarizes your day; you get a nice transcript
0:38:39 out of whatever you record, and then their tool actually summarizes it and organizes
0:38:41 it based off of some templates.
0:38:49 So you can make it for meetings, make it for conferences, make it for random tasks, and
0:38:54 it’ll reorganize that transcript and give you a nice summary.
0:38:55 I love it.
0:38:57 It’s useful and they have a card version as well.
0:39:00 So if you’re just like at a conference all day long, you can just wear it.
0:39:03 It will take notes on every speaker you see.
0:39:04 It’s for conferences.
0:39:09 So right now I’m at the Masters of Scale conference, and all these speakers, I can’t actually process
0:39:11 how much information they’re sharing.
0:39:12 It’s a lie.
0:39:18 So you turn on the plot AI note pin, and I can trust that it’s going to ingest all of
0:39:21 this, and then give me a transcript.
0:39:26 And if you’re not pleased with the summary that it makes, you have that transcript.
0:39:30 Copy it, paste it into your language model, you know, your large language model of choice,
0:39:35 and then say, you know, what was like the three biggest takeaways today?
0:39:40 And it does a great job at quickly and instantly giving you the three best takeaways.
0:39:45 And if you don’t like them, you can say, well, actually, any others, you know, and like you
0:39:47 can just have a regular conversation.
0:39:53 It’s like having a journalist or an assistant with you who can just document your day.
0:39:58 And you know, that’s super helpful if you’re a content creator and you are trying to ingest
0:39:59 a lot of things.
0:40:00 I can’t research all this stuff.
0:40:07 It can help, you know, alleviate that, and I can review it. It saves you from having to carry
0:40:13 a notepad around and take notes all day, which for me, I love that.
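
The “paste the transcript into your large language model of choice” step Don mentions is about as simple as it sounds; something like this sketch, where the file name and model are placeholders and any chat-capable model would work the same way:

```python
# Sketch of asking an LLM for the three biggest takeaways from a day's transcript.
# File name and model are placeholders; swap in whichever provider you prefer.
from openai import OpenAI

client = OpenAI()

with open("conference_transcript.txt") as f:
    transcript = f.read()

summary = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "What were the three biggest takeaways today?\n\n" + transcript,
    }],
)
print(summary.choices[0].message.content)
```
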
0:40:15 So it seems like you’re like a pretty big gadget guy.
0:40:16 I love gadgets.
0:40:19 I’ve got like pretty much every AI gadget that’s come across.
0:40:20 I’ve got it.
0:40:21 I’ve got the rabbit.
0:40:22 I’ve got the plot.
0:40:23 I’ve got the compass.
0:40:25 I’ve got the I’ve got like all the things.
0:40:26 The plot is actually on the way.
0:40:30 I don’t have it yet, but I got a notification that it shipped like a couple of days ago.
0:40:34 But are there any other like really, really cool gadgets that you’re that like more people
0:40:35 should know about?
0:40:37 Yeah, it’s the actually I’m wearing it right now for those that don’t see it.
0:40:41 It’s the Hollyland Lark M2 microphone.
0:40:43 I recommend it because two reasons.
0:40:47 One, it’s got AI noise canceling built right in.
0:40:52 So and it comes with two lapels so you can like have two people do an interview and it’s
0:40:54 very good quality sounding mic.
0:40:57 And then let me just adjust it.
0:40:58 Yeah.
0:41:06 The other thing on that is iOS 18 on iPhone now has a new voice memo feature with transcription,
0:41:14 and now I’m sitting here like, do I still need to use my Plaud for everything?
0:41:20 If I could just have that right in the voice memos app and it saves your transcript right
0:41:27 alongside the audio file and then searchable so you can just search in text your audio
0:41:28 file.
0:41:31 It looks for the word and then it plays the word.
0:41:36 It’s like a like a little like lyrics videos where like the word pops up and you hear this
0:41:37 lyric.
0:41:42 It does that in the freaking built in iPhone and iPad and Mac.
0:41:46 I think all iOS 18 devices can do that now.
0:41:49 I’m sitting there like, oh, shoot, okay.
0:41:50 Huh.
0:41:55 So right now I’m kind of using them both for note taking but we’ll see what’s best.
0:41:59 You could compare notes, pull them both into a large language model later and be like,
0:42:01 hey, what did this one find that this one didn’t find?
0:42:04 What did this one hear that that one didn’t? The audio quality is different.
0:42:07 You know, the Plaud one is not for audio quality.
0:42:09 It’s a very low quality mic.
0:42:13 It’s high enough to get the details of what happened, but not high enough that you’d use
0:42:15 it professionally for audio.
0:42:24 Whereas with the Hollyland Lark M2, you can actually use that audio, as well as, you know,
0:42:27 using it as a tool for capturing the data.
0:42:28 Right.
0:42:29 Right.
0:42:31 I imagine the Plaud is probably, like, optimized for battery life because they expect people
0:42:36 to just have it on all day long, right, where the other one is probably more designed
0:42:40 to be turned on and turned off as you go.
0:42:41 Another device I recommend,
0:42:43 I’m using it right now, is my tripod.
0:42:49 It’s the Insta360 Flow Pro.
0:42:50 It’s nice.
0:42:53 It’s I have it attached to my iPhone right now.
0:42:58 And it’s a robotic arm that can actually do the person tracking.
0:43:01 And so I’m making a lot of content on the go.
0:43:04 And it’s like having a kind of a camera operator with me wherever I go.
0:43:07 And then it folds out three legs.
0:43:08 So it’s also a tripod.
0:43:10 It has an extending neck.
0:43:13 So it looks like a selfie stick.
0:43:18 And with the tap of a button, you can orient it for horizontal content or vertical content.
0:43:20 It kicks butt.
0:43:24 So and it’s everything fits in this tiny little bag carry with me.
0:43:28 Somebody should just make like a somebody should just make a bag like a like a fanny
0:43:32 pack type bag where the whole bag is just like a wireless charger and anything you throw
0:43:33 in the bag.
0:43:34 It just charges it up.
0:43:35 Somebody should make listeners do it.
0:43:36 Please.
0:43:42 I mean, I think we covered a lot of ground on this one.
0:43:46 I know you’re at a conference and probably anxious to get back in and hear what everybody’s
0:43:47 talking about.
0:43:50 Is there any place that people should go and check you out?
0:43:56 But I know on Instagram, you post a lot of amazing videos and some of these concepts
0:43:58 over there.
0:43:59 Shout out your Instagram account.
0:44:00 Yeah.
0:44:01 Instagram is my go-to.
0:44:02 I do a lot of live streams.
0:44:03 I share a lot of stuff on threads.
0:44:04 A lot of reels.
0:44:09 So it’s it’s at D-O-N-A-L-L-E-N-I-I-I.
0:44:14 I am Don Allen Stephenson, the third by go by Don Allen, the third on Instagram and Twitter.
0:44:16 But but yeah, just Don Allen, I-I-I.
0:44:18 And you have a new book.
0:44:20 I’ve got it in my hand here.
0:44:25 Make a Seat, quick elevator pitch, 30 seconds.
0:44:26 What’s the book about?
0:44:27 Why do people want it?
0:44:30 I talk about how I make opportunities for myself.
0:44:33 The whole idea of making a seat is how do you make opportunities?
0:44:35 And I use a lot of AI to do that now.
0:44:40 So I decided to kind of write a book in three parts, how to discover opportunities with AI,
0:44:44 how to leverage technology like AI, and then also how do you build resilience?
0:44:46 There’s a lot of change that’s about to happen.
0:44:49 And I figure, let me share a bunch of my life stories and tools and techniques and put it
0:44:50 all into a book.
0:44:56 I trained an AI to interview me, and then a separate AI to organize that into chapters, and a third
0:45:02 AI to organize the chapters into a book. Written in two months, about 30,000 words.
0:45:04 And I stand by it.
0:45:05 Very cool.
0:45:11 So it’s a book co-written by Don Allen Stephenson, the third and a very much so.
0:45:12 Awesome.
0:45:18 But well, thank you so much for taking time out of the Masters of Scale event to come
0:45:19 chat with us.
0:45:21 It’s been so much fun.
0:45:25 Always a blast nerding out with you about AI and tech and gadgets and stuff.
0:45:27 So once again, thanks so much for joining us today.
0:45:28 Really, really appreciate it.
0:45:29 It’s been fun.
0:45:30 Likewise.
0:45:31 And thank you so much for having me.
0:45:32 I really appreciate it.
0:45:33 This is so much fun.
0:45:34 And yeah, all the stuff that you do inspires me.
0:45:36 I reference your work all the time.
0:45:40 And it’s just like, it’s so cool that we can be even bumping into the same circles over
0:45:41 and over again.
0:45:42 Absolutely.
0:45:43 I appreciate it, man.
0:45:44 Good talking to you.
0:45:45 All right.
0:45:50 Thank you so much for joining us.
0:45:51 Thanks.
0:45:52 Thanks.
0:45:53 Bye.
0:45:54 Bye.
0:45:59 [MUSIC PLAYING]

Episode 31: What will the future look like when AI agents take over mundane tasks? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) are joined by Don Allen Stevenson III (https://x.com/DonAllenIII), former DreamWorks specialist and author of “Make a Seat”.

In this episode, they dive deep into how AI agents are set to revolutionize our everyday lives by speculating on future AR products, and exploring AI-driven automation in advertising. Don Allen Stevenson III also shares insights from his book on leveraging AI and technology during times of change, and how he used AI to write and organize it.

Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd

Show Notes:

  • (00:00) Former DreamWorks trainer, now independent content creator.
  • (04:04) Inspired by Black Mirror, created hopeful Clear Mirror.
  • (07:00) Setup ChatGPT voice shortcuts on Apple devices.
  • (09:55) Encountered rate limit issues but found automation impressive.
  • (15:27) Analyze top YouTube thumbnails for design inspiration.
  • (21:47) Runway Act-One: Video to emotion-matched cartoon.
  • (23:29) Smaller studios embrace AI; larger ones resist.
  • (28:01) Animatics reveal audience engagement through rough storytelling.
  • (29:41) New tech democratizes film creation and opportunities.
  • (33:25) Meta’s Creator AI customizes models using consented data.
  • (37:44) AI note-taking tool aids conference information management.
  • (39:35) iOS 18 voice memos: transcribe, search audio.
  • (43:08) Creating opportunities and resilience with AI insights.

Mentions:

Check Out Matt’s Stuff:

• Future Tools – https://futuretools.beehiiv.com/

• Blog – https://www.mattwolfe.com/

• YouTube- https://www.youtube.com/@mreflow

Check Out Nathan’s Stuff:

The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
