Build a Website Using Vibe Coding in 45 Min (GPT-4 & V0)

AI transcript
0:00:09 Hey, welcome to the Next Wave podcast. I’m Matt Wolf, and I’m here with Nathan Lands.
0:00:14 And today we’re diving into vibe coding again. In fact, we’re bringing back Riley Brown,
0:00:19 one of the guys who sort of led the charge on vibe coding. We’re going to talk about what’s
0:00:24 changed in the world of AI coding since the last time we chatted all sorts of cool, amazing new
0:00:28 tools that make it even easier to code than it was before. So we’re going to dive into all of that
0:00:33 and make sure you stick around because we’re going to build a super fun app live on this episode that
0:00:38 I think you’re going to be pretty blown away by what it can do and how quickly we build it.
0:00:42 So super, super fun episode. So let’s dive right in with Riley Brown.
0:00:50 HubSpot just dropped their 2025 marketing trends report, and you’re going to want to see what they
0:00:58 found. Visual content is delivering 21% more ROI. Small influencers are building 45% more trust.
0:01:05 And AI is changing how fast we can create. And the best part, it’s not just another report full of
0:01:11 stats. It’s a game plan you can actually use. There are frameworks, AI guides to help you do more with
0:01:18 less and real case studies and playbooks from teams crushing it right now. Want to nail marketing in
0:01:22 2025? Go to clickhubspot.com/marketing to download it for free.
0:01:28 Thanks again for joining us, Riley. How are you doing today?
0:01:33 I’m doing great. It sounds like a fun episode. I’m down. Yeah, I think it’s been maybe two or three
0:01:40 months since I’ve been on. And yeah, like you said, like six to 10 big updates per week. It feels like,
0:01:41 yeah, there’s a lot to cover.
0:01:48 Yeah. In the world of AI, I mean, three months is like three years of development. So quite a bit has
0:01:53 happened. Out of all of the developments that have happened in the last three months, like which ones have
0:01:55 impacted your coding the most?
0:02:03 Ooh, I would just say how well Cursor and Windsurf, which are the two tools that I use the most,
0:02:11 how well they understand the code base. Whenever a model changes, like recently it switched from
0:02:16 Claude 3.5 Sonnet to Claude 3.7 Sonnet when Claude came out with their new model. And there’s this weird
0:02:24 period when these models change, because Cursor was basically built for Claude 3.5. And so it got really,
0:02:29 really good at 3.5. And once it switched to Claude 3.7, it acted a little bit differently and it stopped
0:02:34 following all of my requests perfectly, and that happened for a lot of people. And then I think we saw a lot
0:02:40 of people who really love Cursor start switching to Windsurf. And so I think that both of those tools, I think
0:02:44 those are the two best tools to use for AI coding. If you want to like dig deeper, you can use a lot
0:02:50 of the simpler tools, Lovable, Replit, Bolt, etc. But they just understand the code base really well
0:02:56 and you’re able to do longer tasks. I realized with the new Cursor Max mode, you can give it instructions
0:03:01 that take a while. Like, it’ll work for seven, eight minutes at a time, and you’ll come back and it’ll
0:03:06 be done. You’re like, oh my God. It’s not perfect, but yeah, I would say that’s probably the biggest
0:03:13 update. It’s not necessarily a headline, but yeah. Between Cursor and Windsurf, I have both. I’m
0:03:18 literally paying monthly for both of them and I switch back and forth. Like sometimes Cursor will
0:03:23 get hung up and get stuck in a loop and I can’t get it to solve a bug. And then I’ll open up the same
0:03:28 folder in Windsurf and Windsurf one shot fixes the problem. And then I run into the same thing with
0:03:33 Windsurf and I’ll jump back to Cursor and Cursor will fix it. To me, they’re pretty much the same,
0:03:39 but sometimes one figures it out while the other doesn’t. But like, what do you see as like the
0:03:46 biggest differences between the two? Honestly, I use Cursor more because of my habits. When I build an
0:03:52 app, or I’m making content on TikTok or Instagram or YouTube where I’m building an app, I’ll just come
0:03:56 up with an idea and I’ll be like, all right, I want to make it. And then my habits just open Cursor by
0:04:03 default. And so like, to me, I don’t see that big of a difference. I think Windsurf has a nicer user
0:04:09 interface, but I noticed it does get stuck on some weird things that Cursor doesn’t. But Cursor also
0:04:14 gets stuck on things that Windsurf doesn’t. So I had a tweet two days ago about how the
0:04:19 cynical AI crowd will tell me, like, oh yeah, good luck when you get a bug. And then I wrote a list
0:04:24 of these seven things I do whenever I get a bug, and it’s almost guaranteed to beat
0:04:30 it, because you can switch AI models. If all of the AI models don’t work, you can switch to a whole
0:04:37 other piece of software that’s also really good, Windsurf. And then you can use the internet. And now with MCPs,
0:04:42 I think probably the second biggest thing is the fact that I have four or five MCPs built directly
0:04:47 into Cursor. Web search is actually just built directly into Cursor, but you
0:04:53 can also set it up to do Perplexity deep research directly in Cursor, because Perplexity released an API
0:04:58 that you can use via MCP. This might be kind of jargon if you’re new to this, but basically you can
0:05:05 search the internet directly from Cursor. So you can say, I want to build this app that requires,
0:05:11 let’s say, the Deepgram API for speech to text. You can actually have Cursor or Windsurf
0:05:16 search the internet, come back with that information, and then look at your code base
0:05:22 based on that information. And then it can generate the code with the information. So it’s like a research
0:05:30 agent, and that has been really impactful and really cool. Yeah. So basically an MCP, for anybody listening,
0:05:34 it stands for Model Context Protocol. It was created by Anthropic. We actually did an episode where we
0:05:40 talked a little bit about it, but it’s sort of a layer that lives between like the API of various tools that
0:05:45 you might want to connect with and your large language model so that it sort of makes it more
0:05:51 of a uniform way that the large language models can communicate with tool APIs. So basically every
0:05:57 single API is like a little bit different. Every single company’s API has these full-on docs of how to use
0:06:04 their API and nobody really quite does it the same way. And using MCPs standardizes that. So the large
0:06:10 language models communicate with the MCP and then the MCP goes and communicates with the APIs on behalf of the
0:06:15 large language model. That’s probably the simplest explanation I can give, but you know, in even
0:06:20 simpler terms, the MCPs make it easier for the large language models to use tools. Yeah. So that’s
0:06:24 essentially what we’re talking about when we talk about MCPs. Totally. Yeah. I think that was a good
0:06:29 explanation of it. Yeah. So when it comes to the MCPs, I started using Supabase. I know you’re not a
0:06:36 fan of Supabase. I think you’re more on the Firebase train, but Supabase has an MCP where I can build my
0:06:41 database, and then Cursor can actually go and look at the database and double-check that things are
0:06:46 working properly on the database side of things. And there’s also a browser-tools one that I’ve been
0:06:50 using that can actually look at your console on your browser and take screenshots of your browser. So it
0:06:56 can sort of double check its work. But those are really the only two MCPs that I’ve even used at all.
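The "uniform way" Matt describes can be made concrete with a toy Python sketch. This is not the real protocol (actual MCP servers exchange JSON-RPC messages over stdio or HTTP, and every name below is made up): the point is that each server wraps a differently-shaped vendor API behind the same two operations, so the model-side code never changes.

```python
# Toy sketch of the MCP idea: differently-shaped vendor APIs hidden
# behind one uniform discover/call interface. All names here are
# hypothetical; the real protocol uses JSON-RPC, not method calls.

class MCPServer:
    """Wraps any vendor API behind the same two operations."""

    def __init__(self, name, tools):
        self.name = name
        self._tools = tools  # maps a tool name to a plain callable

    def list_tools(self):
        # Uniform discovery: the model asks every server the same question.
        return sorted(self._tools)

    def call_tool(self, tool, arguments):
        # Uniform invocation: the same call shape for every vendor.
        return self._tools[tool](**arguments)


# Two stand-in vendor "APIs" with totally different shapes, echoing the
# Deepgram and Perplexity examples from the episode.
def fake_transcribe(audio_url):
    return f"transcript of {audio_url}"

def fake_deep_research(query, depth="deep"):
    return f"{depth} research results for {query!r}"


speech = MCPServer("speech", {"transcribe": fake_transcribe})
search = MCPServer("search", {"web_search": fake_deep_research})

# The "LLM side" treats every server identically, never reading vendor docs.
for server in (speech, search):
    print(server.name, server.list_tools())
```

The real servers mentioned here (Supabase, browser-tools, Perplexity) each play the `MCPServer` role, and Cursor plays the loop at the bottom.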
0:07:01 And I’m actually just now learning that the Perplexity MCP works well as well. So I’m gonna have to add that
0:07:05 into my mix. What are some of the other ones that you’re using though? Yeah. So there was one that I
0:07:11 used. It stopped working for me. So this is another thing about MCPs right now. I think it’s incredibly
0:07:18 early. And I think if you’re not technical, unless you’re creating content on it like I am,
0:07:23 it might be perfectly okay to wait a little bit, because they’re not perfect. There’s a ton of
0:07:29 potential. And then there’s like so many, I think there’s like four different YC companies in this batch
0:07:34 that are literally trying to make it as simple as just one click to add an MCP. Because right
0:07:40 now you have to like copy and paste code, you have to get different API keys. And so like, if you don’t
0:07:44 understand MCPs, if you want to like get ahead, I think there’s a lot of business opportunities in
0:07:49 MCPs, but like, don’t feel like you’re falling behind anything big because it’s still in its infancy
0:07:56 stages. And so one that was really useful for me that I did try, there’s a Firecrawl MCP that can
0:08:03 literally crawl a website and figure out how it’s designed and then build the UI down to
0:08:09 the pixel perfectly, which was one for design. But yeah, there’s all kinds of different ones too.
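The copy-and-paste-plus-API-key step Riley mentions is, in Cursor today, a small JSON config rather than real code. The shape below matches Cursor's `mcp.json` `mcpServers` format, but the package name and key are placeholders; the README of whichever MCP server you install gives the real values.

```json
{
  "mcpServers": {
    "perplexity-research": {
      "command": "npx",
      "args": ["-y", "your-chosen-perplexity-mcp-server"],
      "env": { "PERPLEXITY_API_KEY": "<your key here>" }
    }
  }
}
```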
0:08:15 Yeah. I don’t know if Sam Altman posted it today or yesterday, but apparently OpenAI is releasing an
0:08:21 MCP as well. And I’m really, really hoping they release one that can communicate with their new
0:08:27 image gen model. Cause that would be really cool, to actually communicate with an MCP inside of Cursor
0:08:32 and have it generate, you know, image assets and things like that directly from within Cursor without
0:08:37 even having to leave. So that to me is also really exciting. It’d also be really exciting if, you know,
0:08:41 it works with things like o1 pro, although I think that’ll get really, really expensive, really quick,
0:08:46 if you’re tapping into the o1 pro API, as Nathan probably knows, cause he’s played with it quite a bit
0:08:52 more than I have. Yeah. The o1 pro, it’s like $600 per million output tokens, right? It’s insane.
0:08:57 Something like that. Yeah. Yeah. Which I think is why Cursor and Windsurf haven’t rolled it out yet. And if
0:09:02 they do roll it out, I imagine it’s probably going to be a bring your own API sort of situation to be
0:09:07 able to use it. I’m not sure why they rolled that API out as of right now. Like, there was a short window
0:09:11 where o1 pro was absolutely amazing. And on some benchmarks, I still think it is number one or
0:09:16 number two now, but now the difference is so small, I don’t see why anyone would pay that much
0:09:23 to use the API. Yeah. Yeah. I mean, I’ve been hearing a lot of rumors about DeepSeek R2 and how it is
0:09:30 going to be as good as o1 pro and, you know, it’s probably going to be a hundred times cheaper,
0:09:36 which, I mean, is going to cause OpenAI’s prices to plummet. Yeah. And it will probably be open source
0:09:40 and you’ll probably be able to run it straight through something like Groq (G-R-O-Q) to
0:09:46 really, really crank up the speed. And we also now have Gemini 2.5, which just came out from Google,
0:09:51 which a lot of people are saying is sort of on a similar level to o1 pro also.
0:09:56 Yes, Gemini 2.5. Honestly, I’m going to be honest, I have not tested Gemini; that’s
0:10:03 on my main list. I have been obsessed with how Sam Altman and OpenAI are so good at
0:10:10 dominating the narrative. Like, yesterday Google released arguably the best
0:10:17 model for coding. And we’re at a time where vibe coding is at its peak in terms of popularity.
0:10:22 Vibe coding as a niche has dominated Twitter. And then on the day they released the best
0:10:28 tool for vibe coders. No one’s talking about it because Sam always releases it right before.
0:10:34 And I go back, like Sam has done this since the beginning, like as a troll to Google, every time
0:10:38 they have something lined up, it’s just like, uh, OpenAI releases something that they’ve been
0:10:42 holding in their pocket. Yeah. I think we should talk about the image model. I think it’s really big
0:10:48 for vibe coding because I think a lot of people who vibe code love it for design. And this image model can
0:10:56 generate perfect design. If you ask it to create an iPhone app layout, it’ll do it like perfectly to
0:10:59 the pixel and make it incredibly beautiful too. Right? Like we had an episode before we’re talking
0:11:02 about like AI is going to eventually make the web fun again. Like we were talking about that,
0:11:07 like, cause like designs have gotten kind of boring. I saw a screenshot yesterday of somebody doing,
0:11:11 it was like Studio Ghibli web app, like redesign my web app in Studio Ghibli. I was like,
0:11:18 Oh, that’s way better, actually. Somebody made a whole Wikipedia page inside of the new ChatGPT
0:11:24 as well. And it looked identical to a real Wikipedia page. Yeah. Yeah. It’s insane. And so
0:11:31 a lot of the reason why people use tools like Figma are to ideate and get their ideas out because like
0:11:36 a lot of people are very visual. But imagine GPT-4o, their new image model, imagine if it was
0:11:40 five times faster. It’s safe to assume that it’s going to reach that point
0:11:45 by the end of this year; that’s probably how fast this is going. And so
0:11:50 if it’s three to five seconds per image generation, imagine instead of having to like draw everything
0:11:55 out manually, you’re just like, no, move this bar slightly lower, create an image. And when you
0:12:00 generate an image on GPT-4o, pay attention to how it does it, and pay attention to how Midjourney
0:12:05 does it. If you look at Midjourney, they use diffusion, and so it goes from like a jumbled mess
0:12:10 into something really refined, whereas OpenAI’s, I forget even what it’s called, autoregressive or
0:12:15 something, goes top to bottom. So it literally generates the top row
0:12:20 of the image, and then, like, pixel perfect, it generates all the way down. Yeah. And so I think
0:12:26 we’re going to see a design company come out of this. That is just a multi-billion dollar company
0:12:31 that just uses like voice to like do mock-ups and stuff. It’s a big deal.
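The diffusion-versus-autoregressive contrast Riley describes can be sketched in a few lines of Python. This is deliberately a toy (made-up "pixels", no neural network, hypothetical function names): the autoregressive version finalizes the image one row at a time from the top, like the GPT-4o reveal, while the diffusion version starts the whole canvas as noise and refines every pixel a little per step, like Midjourney's jumbled-to-refined look.

```python
import random

def autoregressive_generate(rows, cols):
    """Build the 'image' one row at a time, top to bottom; each row is
    pixel-final before the next one starts (GPT-4o-style reveal)."""
    image = []
    for r in range(rows):
        row = [(r * cols + c) % 2 for c in range(cols)]  # toy checker pattern
        image.append(row)  # this row never changes again
    return image

def diffusion_generate(rows, cols, steps=4):
    """Start the WHOLE canvas as noise and nudge every pixel toward the
    target a little per step (Midjourney-style jumbled-to-refined)."""
    rng = random.Random(0)
    target = [[(r * cols + c) % 2 for c in range(cols)] for r in range(rows)]
    image = [[rng.random() for _ in range(cols)] for _ in range(rows)]
    for step in range(1, steps + 1):
        alpha = step / steps  # how far along the denoising we are
        image = [[(1 - alpha) * image[r][c] + alpha * target[r][c]
                  for c in range(cols)] for r in range(rows)]
    return image
```

Both paths end at the same picture; what differs is the order in which pixels become final.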
0:12:36 Yeah, no, it literally feels like you’re sort of standing over the shoulder of a designer and saying,
0:12:39 all right, Hey, fix this for me. Hey, change that. Hey, add this text.
0:12:43 It’s crazy how it seems to really understand images so much more. Like, to Riley’s point,
0:12:47 Midjourney can make really beautiful stuff, but it doesn’t seem to really understand what you’re
0:12:52 asking it. Yeah. Right. But 4o does actually seem to get what you’re talking about
0:12:57 to like a crazy level. Yes. So I was actually using it where I made thumbnails for my YouTube
0:13:02 video. The thumbnail at the top here that you can see was actually one that my designer made,
0:13:07 but I took the one that the designer made. And then I started step-by-step tweaking it to get
0:13:13 something more out of it. So I wanted it to like change my facial expression and put a WTF in it
0:13:18 instead of worth it. And you can see it made pretty much the same thumbnail, but just change the text and
0:13:23 change the face a little bit. And then I said, make me not smiling. And then it made the same
0:13:29 image, but with a frowny face on it. Right. And then I basically prompted it again to get another
0:13:33 variation. This one’s sort of looking to the camera, this one’s sort of looking more at the device.
0:13:38 And I was just sitting there like saying, Hey, change this thing, change this thing. And it was
0:13:43 sort of going through the process and changing them all. I had this image here of me holding up all
0:13:49 these devices. And then I said, add the words worth it. And it put it twice. And one of them was sort
0:13:55 of messed up, but then I eventually got it to do it just one time. Right. And then I asked it to close
0:14:00 the mouth and then it started to look less and less like me, unfortunately. But you know, I was sitting
0:14:05 here just prompting different tweaks that I wanted to make to the image. And it was just adding the
0:14:10 tweaks and making better and better thumbnails for me. I threw in some images of like the Padres
0:14:15 baseball season starting. And I told it to make this image in GTA five style. I told it to make
0:14:20 it in Rick and Morty style. And then I was sending these to some of my friends who are not Padre fans
0:14:26 to sort of troll them a little bit. I had it making infographics for me earlier. Here’s a Venn diagram
0:14:32 that I had it make. I was having it make like Studio Ghibli style and Simpsons style versions of like family
0:14:36 photos. That’s a huge business, by the way. I think somebody should be out there like selling that to
0:14:41 mom and pops and like individuals right now. 10 hours after they release their API, we’re going
0:14:47 to see 10,000 of the same wrappers released. Yeah, I might be one of them, for fun. I put an image of
0:14:55 myself, South Park version, Minecraft version, pixel art, video game version, 3D voxel version.
0:15:00 I’ve been nerding out over this, like pretty much my whole day so far today has been spent playing with
0:15:04 this stupid tool. So is Midjourney screwed, or, like? Because they were supposed to come out with V7,
0:15:08 right? And the big thing of V7 was it was going to be more consistent. It was finally going
0:15:12 to understand your images better. And it’s like, well, that’s what this is. So I guess today was
0:15:17 Midjourney’s office hours, and I saw people live tweeting during them. And I guess
0:15:22 David from Midjourney was on there going, oh, the ChatGPT model is not that good. I don’t know why
0:15:27 everybody’s so excited about it. It’s not nearly as good as what we’ve been building. And just sort
0:15:32 of this very, very negative tone towards it. And then all the replies on that tweet were like,
0:15:40 this sounds like copium to me, you know, like we’ll be right back to the next wave. But first,
0:15:43 I want to tell you about another podcast I know you’re going to love. It’s called Marketing Against
0:15:49 the Grain, hosted by Kipp Bodnar and Kieran Flanagan. It’s brought to you by the HubSpot Podcast Network,
0:15:54 the audio destination for business professionals. If you want to know what’s happening now in marketing,
0:15:58 what’s coming and how you can lead the way, this is the podcast you want to check out.
0:16:02 They recently did a great episode where they show you how you can integrate AI into the workplace.
0:16:05 Listen to Marketing Against the Grain wherever you get your podcasts.
0:16:13 I really like David. I just love the fact that Midjourney was completely bootstrapped. Like, I
0:16:17 don’t think they’ve raised any money. And there’s a tweet that he tweeted a while back where he just
0:16:24 doesn’t seem that interested in the competition, which may come back to bite him now. Because I actually
0:16:30 do think that throughout the last six months, Midjourney’s images do have a little bit more
0:16:35 energy and soul to them compared to a lot of the other image generators. And there are other good
0:16:40 image generators, and Flux is good, but I’ve just respected their brand, I guess. And I think their
0:16:47 site is amazing. And I think that ideally they would add OpenAI’s API into their site, but I don’t
0:16:50 know if they’re going to do that. I don’t know. I do have faith that they’re going to build some cool
0:16:54 3D stuff, but I also think OpenAI is going to do that. So I don’t know. Who knows?
0:16:58 You know, I agree. I think Midjourney has like a specific style. You can see images and go,
0:17:04 okay, this is Midjourney style. I find myself using Midjourney a lot less than I used to these
0:17:09 days. I think, you know, Ideogram has caught up. I really liked the Leonardo Phoenix model,
0:17:15 but I also have equity in Leonardo, so there is that. They’re good. Yeah. But, uh, you know,
0:17:20 I still pay the subscription, but I just find myself using it less and less. And now I really like
0:17:26 ChatGPT and this new model, or the ability to edit, like, to take existing photos and then have it add
0:17:31 the text or change the style. Or, you know, I did make some images in the Studio Ghibli style and was
0:17:36 posting them because it’s just fun, but we’re hitting the saturation point on Studio Ghibli right
0:17:41 now, but it’s still really, really, really fun. But I agree. I still have a lot of respect for
0:17:45 Midjourney. I think they have some big things in the works, but I also feel like they’ve been saying
0:17:51 V7 is coming out in two weeks for the last six months. I’ve been skeptical for a long time.
0:17:54 Cause I’ve always thought that Midjourney, like I said, doesn’t really understand what’s going on in your
0:17:58 image, or it doesn’t seem to. Especially if you try to edit, you can see that really quickly. Like,
0:18:02 oh, it does not understand what’s going on. It doesn’t exactly know how it made the beautiful
0:18:06 thing in the first place, so it has a hard time editing that beautiful thing. And it
0:18:12 seems like with, you know, OpenAI and the resources they have, their LLMs are going to
0:18:15 be able to understand images. And now we’re seeing that they’re actually understanding
0:18:19 what’s in the image. And it feels like fundamentally different technology.
0:18:24 Yeah. I think the biggest takeaway from this is that AI-generated images are a commodity now.
0:18:30 Even Emad Mostaque, the original founder of Stable Diffusion (I don’t know if he was a founder,
0:18:35 but I really liked him), he was like, AI images have been solved by this model because of its command
0:18:40 over the pixels. It knows exactly what to do. And I think because of that, and if you think of
0:18:46 AI-generated video as just a bunch of images strung together, you know, 30 images in a
0:18:52 second, we’re going to see that level of command. And, you know, I don’t see a reason why we won’t
0:18:58 be able to create Studio Ghibli animations at an incredibly high level from OpenAI, if it has that
0:19:04 level of command and it can just generate all the pixels for each of those 30 frames, especially at this quality.
0:19:09 Like if you look at some of the 3D stuff, like if you create 3D characters and you zoom in,
0:19:13 it just understands physics insanely well. It’s like scary, honestly.
0:19:16 Yeah. I mean, you take some of the Studio Ghibli images it generates, or some of the
0:19:20 voxelized images that I was generating, and right now, if you throw them into something like
0:19:26 Runway’s Dream Machine, it’ll actually animate them and make it look fairly close to a Studio
0:19:31 Ghibli animation. So I imagine it’s only a matter of time before you can just prompt that straight
0:19:33 in ChatGPT without having to use multiple tools.
0:19:38 It’s so smart. I literally handed 2.5 Pro my game docs explaining like the aesthetic of my game
0:19:44 and the story and everything like that. And it generated just like shocking art that like got so
0:19:48 many details. I tried to get Midjourney to do this. Like, it made it beautiful, but it just missed all
0:19:53 the details. And this like gets the details of like my story and like integrates those into the art.
0:19:54 It just blew my mind.
0:19:59 Yeah. I want to jump back real quick to like the Gemini 2.5, because like we mentioned,
0:20:04 that’s sort of big news, but obviously this AI image generation from OpenAI, you know, sort of
0:20:09 overshadowed it, because, you know, that’s sort of OpenAI’s MO, to try to overshadow Google
0:20:16 whenever they can. But the Gemini 2.5 Pro model actually has a million-token context window
0:20:22 and it is insanely fast. I actually took a transcript from a 30-minute video and was having it help me
0:20:27 with ideas for titles for the video. And it took this transcript. I don’t know exactly how many words it
0:20:33 was, but it was, you know, tens of thousands of words. I plugged it in there and like within three
0:20:38 seconds, it had like 10 title ideas. It didn’t even seem like it took the time to read it. It’s just
0:20:45 that fast. And then if you’re talking about things like vibe coding with a million token context window,
0:20:51 if you have a fairly decent size app, you can actually copy and paste your entire code base in
0:20:57 there and let Gemini 2.5 read that entire code base and sort of find bugs and find redundancies and ask
0:21:02 it to like refactor the code for you and things like that. And it’ll actually do that. When I try to do
0:21:08 it with OpenAI’s o1 pro, I use something like Repo Prompt, but on a PC Repo Prompt wasn’t available.
0:21:14 So I was using one called Repomix. And with Repomix, I was copying my entire code base, but o1 pro was
0:21:19 saying there’s too much text here for us to read. Like, it was already over the limit. But now
0:21:25 with Gemini 2.5, you can actually throw the entire code base in there and actually get it to read the
0:21:30 whole thing for you, which to me is wild. But like you Riley, I actually haven’t spent a lot of time
0:21:38 using 2.5 pro with coding yet. Where are people using it? Like, is it just in the Gemini studio or?
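The Repomix-style step Matt describes, pasting a whole code base into the million-token window, is simple to sketch: walk the project and concatenate each source file behind a path header so the model can tell files apart. A minimal version (the real tool also respects .gitignore, skips binaries, and counts tokens; `pack_codebase` is a made-up name):

```python
import pathlib

def pack_codebase(root, exts=(".py", ".ts", ".tsx", ".css")):
    """Concatenate a project's source files into one prompt string,
    with a header line marking where each file begins."""
    parts = []
    for path in sorted(pathlib.Path(root).rglob("*")):
        if path.is_file() and path.suffix in exts:
            rel = path.relative_to(root)
            parts.append(f"===== {rel} =====\n{path.read_text()}")
    return "\n\n".join(parts)

# The resulting string is what you'd paste into AI Studio, prefixed with
# an instruction like "find bugs and redundancies in this code base".
```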
0:21:43 Yeah. It’s in their AI Studio. I think aistudio.google.com is where you can use it.
0:21:46 That’s why I asked Logan last time we had him on. I was like, why don’t you guys have it as,
0:21:50 like, an actual property? Like, you’re hiding it behind the AI Studio thing. And like,
0:21:54 I get it. They’re like targeting a different audience, but it’s like, it feels like it could
0:21:57 be great if they like made a great consumer product with that in it. But yeah, you can see here,
0:22:05 we’ve got the token count here and it’s got zero of 1,048,576 tokens on the screen.
0:22:10 And so, I mean, that context window is really, really what makes the difference here.
0:22:14 Yeah. And they said 2.5 pro is going to get 2 million soon, which is nuts.
0:22:19 That’s insane. Yeah. Yeah. And so, I mean, I would imagine by like the end of the week,
0:22:26 probably by the time this episode is live, you’ll probably see Gemini 2.5 Pro in Windsurf and Cursor,
0:22:31 if I had to guess, because those guys, whenever these new models come out and the API is available,
0:22:35 they get them in there quickly. I don’t know the cost though. So I, you know, that’d be the one sort
0:22:41 of limiting factor, I guess. Yeah, totally. I do notice that Anthropic, for whatever reason,
0:22:45 their models have seemed to be better in Cursor and Windsurf, whether or not the code is technically
0:22:50 good enough. It just kind of understands their whole tool system better. I don’t know. I don’t
0:22:57 know the science of why that is, but I remember testing Gemini in Cursor. And even though the previous
0:23:02 version of Gemini is good when I copy and paste code over, when I use it in Cursor it doesn’t quite
0:23:06 understand the code base as well. Yeah. I’ve noticed the same thing when I was trying
0:23:12 to use, like, o1, not o1 pro, but just the standard o1. I never got results that were
0:23:20 nearly as good as what Claude 3.5 or 3.7 would do for me. Yeah, I agree. Yes, totally. Should we try to
0:23:24 build something real quick? I’m kind of curious to see how some of your workflows have changed since the
0:23:29 last time we did a video together. So I think it’d be kind of cool to try to build something
0:23:33 simple. Okay. What do you think, Nathan? Like, I’m trying to think of something where we could use
0:23:39 for, even if it was just like asset generation or something for even like a website or because when
0:23:42 I saw that recently, I was like, oh, that is going to be a huge, like I’ve said, like websites have
0:23:46 been so boring for so long. Everyone just copies everyone. You know, there’s like one or two big
0:23:51 websites, like Linear and a few others, and every new SaaS app looks just like Linear.
0:23:57 Yeah. Yeah. Well then you have like Vercel’s V0, right? Which is literally designed to paste a URL
0:24:02 in there and then have it clone a site for you. Yeah. So you’ll be able to pick what style. You’ll pick
0:24:06 like, you know, make it in a Minecraft style or whatever, which is going to be nuts. Yeah.
0:24:13 So another update that was just released actually, that we should talk about is V0 released some new
0:24:18 features. Oh, cool. I wasn’t even aware. That is part of their new features. So that’s a good idea.
0:24:24 Yeah. So let’s go to ChatGPT, right? And let’s start an agency. Us three, we’re starting an agency.
0:24:32 Okay. And what do we do? Um, you know, we sell our AI services. We’re AI experts and we can
0:24:40 say, come up with a logo for my company where we are consultants. We’re really going to do this,
0:24:47 right? This is a good idea. This is a real business being formed right now on this episode.
0:24:53 It’s a real business. Okay. So the reason I’m doing this is I want to talk about V0’s latest feature
0:24:59 where you can just paste in images directly into the chat and say, use this image in the app. And that
0:25:06 wasn’t a thing before. And to my knowledge, like Lovable doesn’t do that either. Bolt might do something
0:25:09 like it. I think in Bolt, you can actually just go into the code files and put it in the public folder.
0:25:23 Please make it a bunny and don’t include the text. Uh, make it in this style of, I guess we could use
0:25:28 the style of the rabbit. I mean, I do like the slogan. Yeah. Yeah. I think the slogan’s good, but
0:25:34 it’s too, uh, corporate for me, uh, personally. Yeah. It also looks very generic. Whenever you
0:25:38 ask an AI to generate an image of what AI looks like, it always does this sort of brain with like
0:25:44 neural network kind of imagery. Yeah. It’s in the training data and you will never get it out. Okay.
0:25:53 This is good. I like this already. Completely disregard the styling before. I want a cute rabbit
0:26:02 mascot, Pixar style, simple, please, like a logo. Okay. So what you’re saying is we can actually make a
0:26:09 logo and then give V0 the logo and it’ll sort of design the site around that logo. Yeah. So let’s say
0:26:24 we want to build a landing page for my agency that helps people with AI and come up with good copy
0:26:30 writing. I bet we could even generate like mockups with 4o and then hand that off as well. Like we
0:26:35 might even get something more beautiful doing that. Yeah. Okay. So here’s 4o. And so what we could do
0:26:41 here. And so if you download the image, right, all you need to do is just drag in the image here and
0:26:46 you can do as many images as you want. Let’s say you have a portfolio of all a bunch of projects or
0:26:50 you’re a designer, you could upload all of your designs and you could say, build a landing page
0:26:55 for this. My name is Riley Brown, put it at the top. And my number is this, call me if you need to,
0:27:02 and come up with copywriting and convert people to fill out a form for email.
0:27:07 So is this your normal flow? If you’re building a new app, do you sort of design it out in V0 first
0:27:14 and then pull it into cursor? I do that sometimes, especially with landing pages. Please use this
0:27:21 attached image as a logo design. We’ll see what it does. And then we can always ask it to edit it.
0:27:26 I think when I’m doing landing pages, yes, I just use V0 and then V0 has this feature,
0:27:31 which is download zip. That’s how I do it. I just download the zip and then open it in cursor.
0:27:36 And you just open that project after you unzip it. And then you can just immediately start editing it.
0:27:41 And it’s just a really fast way to like concept things. And then now what this feature does
0:27:46 basically is when you press download the zip, this image that you just paste into the chat will
0:27:51 actually be in the file. So you can open it up straight in Cursor. And then Cursor does a lot
0:27:56 better, in my opinion, after prompt five, you know, once you start getting deeper into the project.
0:28:00 Gotcha. Yeah. That’s been a similar workflow. I like sort of getting the design out of V0 for
0:28:05 whatever reason, V0 seems to be the best right now as sort of coming up with a design that looks nice
0:28:09 and clean. And then you grab the zip. And then what I’ll typically do is throw it into cursor.
0:28:13 And then I’ll say, read this code and make sure you install all the dependencies to be able to run
0:28:16 this code. Cause a lot of times you’ll start running into errors and it’ll be like, Oh,
0:28:20 you need to install this thing. And then you need this thing installed on your computer. And a lot
0:28:24 of times like the versions of react or whatever you’re using aren’t installed yet. So I just make
0:28:27 sure it installs all the dependencies and then we’re off to the races.
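The zip-to-Cursor handoff Matt describes boils down to a few terminal steps. Here is a minimal sketch; the folder name and the dependency list are invented placeholders, not from the episode, and the npm commands are shown as comments because they need a real V0 export to run against:

```shell
# Sketch of the V0 export -> local dev handoff described above.
# "v0-export" and the dependency versions are made-up placeholders.
mkdir -p v0-export && cd v0-export
cat > package.json <<'EOF'
{
  "name": "v0-landing-page",
  "dependencies": { "next": "^14.0.0", "react": "^18.2.0" }
}
EOF
# In a real project, after unzipping the V0 download, you would run:
#   npm install   # pulls in everything package.json declares
#   npm run dev   # starts the local dev server
# Asking Cursor to "install all the dependencies" amounts to it reading
# this file and running npm install for you:
grep -o '"[a-z]*": "[^"]*"' package.json
```

The point is that the errors Matt mentions almost always trace back to this one file: if a declared package or the right React/Next version isn’t installed yet, the dev server won’t start until `npm install` has run.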
0:28:33 Totally. It’s a lot of fun. See, okay, there you go. You see it, it put it in the page. Obviously I
0:28:38 would probably go to Canva and use their quick, you know, background remover feature before you did
0:28:43 that. So it kind of hovers over the site. You can do that in ChatGPT now too. Well, 4o, you can throw
0:28:47 the image in 4o and say, remove the background, and it’ll do it there too now. Wait, will it actually
0:28:52 be like a PNG? It’ll be a clear background or will it be a white background? It’ll be a clear
0:28:57 background. It’ll be a transparent PNG. Yeah, it’s good. I did not know that. Yeah. So
0:29:01 you should be able to just give it a prompt, like use the same image and remove the background
0:29:07 or make it a transparent PNG. That’s crazy. I did not know that. Yeah. Wow. I’m telling you,
0:29:12 these design tools are trouble. That is such an ideal workflow is to just be able to just say,
0:29:17 get rid of the background, add text on it. Like that is the future of design in my opinion,
0:29:24 which is why I’m so excited about, like, maybe 4o getting an API or getting an MCP inside of Cursor.
0:29:27 And then it’ll just do it straight from your IDE. Yeah.
0:29:34 Yes. That is huge. That is crazy actually. Cause now what we can do, notice how we already have
0:29:42 this image in here. We can say, actually replace the image that we just added with the one in this
0:29:47 message. I don’t know if I worded that right. It’ll figure it out, but now it’ll be like
0:29:52 hovering over the app. Yeah. It matched the color style of the logo and all that kind of stuff.
0:29:57 There we go. It’s already done. And it did a quick edit too. So now it’s hovering over it.
0:30:06 And like, what we can do is we can say, make the rabbit animated, have it, uh, pulse? I forget the name
0:30:10 of the animation and make it look fun. I don’t know.
0:30:15 Sometimes it’s fun to just let AI get creative with itself and see what it comes up with.
0:30:19 I mean, that’s vibe coding. I mean, sometimes you just gotta let the AI do its thing. Cause
0:30:25 I don’t know, it’s probably a better designer than me, but yeah, I think that’s cool. And V0 just
0:30:30 recently added a database feature. I haven’t tried this yet. I don’t know if it’s through Supabase.
0:30:36 Um, and so I just know that they have a database feature. V0 is basically trying to become, like,
0:30:42 full-stack, where you can build a full app that uses AI tools, that uses APIs, that has
0:30:47 full files that you can add and a database. And so we’re definitely seeing a lot of competition in
0:30:48 this space for sure.
0:30:53 Can you actually import something you already built into V0 and iterate off of it? Or is it sort
0:30:56 of designed to start from scratch with V0?
Hmm. I’ve never imported anything because it’s made with shadcn. And so that’s why it has like
0:31:11 relatively similar components when you make it. And I don’t think you can import anything that’s not in
0:31:16 that framework. And so I think for that reason, they don’t allow it. Wait. Oh, you can see it’s like, uh,
0:31:17 animating a little bit. Can you see?
0:31:22 Yeah. It actually added a shadow below it too. And the shadow is sort of animating with it. I mean,
0:31:23 it’s subtle, but it’s there.
0:31:30 I like that. That’s cool. And yeah. And you can just vibe code. And they also actually made like
0:31:31 these little shapes that are animated.
0:31:32 Oh yeah.
0:31:37 That’s kind of cool. I didn’t even ask for that, but you know, I said, and make it fun. And there you go.
0:31:38 There’s the fun.
0:31:39 Yeah. There’s the fun.
0:31:43 Yeah. I was just thinking about, I’ve known Guillermo who created Vercel for a long time,
0:31:48 like back in the early Node.js days. And back in the day, Next.js was like the easiest way
0:31:52 to create a very simple, beautiful website, very
0:31:56 minimalistic. And this is like the natural evolution of that. Like now when you want to start
0:32:02 something new, instead of just creating a basic bland Next.js website, you go in here with your
0:32:06 V0 and you can create whatever style you want versus just having to accept their style.
0:32:07 Totally. Yeah.
Let’s, like, make it so if you press get started, it opens like a modal box that you can, you know,
0:32:17 put contact details in or something. It doesn’t have to actually like submit the contact details
0:32:18 anywhere. Let’s just get the design working.
0:32:22 I want to test this now with using 4o to actually generate the different elements of the website and
0:32:27 then paste that in and see, like, you know, like here’s a testimonial section, like make that
0:32:29 really cool. And then like, Oh, let’s make this section.
0:32:35 Oh, just design a whole website layout for you. And then just pull in the image and say,
0:32:37 make this website from this design.
0:32:41 Yeah. I bet there’s some still some limitations there if I had to guess, but it probably can do a lot.
0:32:48 Yeah. So if we hit get started, let’s have this character animate into a different position. I’ve
0:32:55 tried this before. Let’s see if this works. So I just said to chat GPT, this is why it’s so powerful
0:32:59 for designs. Like you can do this with buttons. You can do it with basically any component on your
0:33:04 app is you can get it to like slightly change. And like when you do something on the site,
0:33:06 I have an idea. It’ll make sense in a second.
0:33:09 And it’s automatically giving it the transparent background too.
0:33:16 Amazing. Okay. So what happened? Please have a third button to the right of learn more that says
0:33:24 not interested. If the user presses not interested, the rabbit will change to the image that I just
0:33:34 uploaded and then grow three X the size and attack the user. And it will do a shake animation and then
0:33:38 have big text pop up below it saying, no, you have to press get started. It makes me think of the bunny
0:33:44 from Monty Python, the killer bunny. I don’t know. We’re just having fun. So was that Wispr Flow that you were
0:33:50 using for that? Yeah, that’s Wispr Flow. I’ve been using this since right when I started
0:33:54 vibe coding. I actually know the founders. Oh, nice. Yeah, it’s a great tool. That’s what you use
0:33:58 too, right, Nathan? Yeah, yeah. Riley, after we talked, I injured my hand and then I ended up having
0:34:02 to use it. Like I kept saying, oh, I’m going to use it. That’s really cool what Riley’s using. And then I
0:34:07 injured my hand. I’m like, I can still get a lot of work done by using Wispr Flow. It’s been amazing.
0:34:14 Oh, yeah, it’s great. I hardly type anymore. And it’s spelled Wispr, without the H or the E. So I always
0:34:19 got that wrong. So if you guys are looking for it, it’s very easy. I love it because it’s hidden.
0:34:25 It’s like a very subtle tool. It doesn’t try and be more than what it’s made for. Yeah. So you just
0:34:30 have like a hotkey set up on your keyboard, you press it and you can talk? It’s the bottom left button on
0:34:35 Mac. From a user interface standpoint, it’s just this little thing right here. When you release, it just
0:34:41 takes what you said very quickly. And it will like try and predict what you say. Like if you say
0:34:47 something and then you’re like, wait, actually, I mean this, it’ll just do what sentence it thinks
you mean. Oh, cool. Yeah. It’s really amazing for people who have an injury, because, like, I actually
have an MMO mouse with, like, all the buttons on the side. And I mapped different
things, including Wispr, to one of the buttons on there. So I was using one hand. I still could get
0:35:03 work done. So like I’m using one hand. I just press the hotkey on the mouse button and then just start
0:35:07 talking. It’s like I could still get everything done. A lot of things done with one hand.
0:35:11 That’s awesome. Yeah. Yeah. Love it. All right. Wait, where were we with it? Oh, yeah. Not interested.
0:35:17 So if we press this, what the? Wait, wait, wait. But here, here we go. Not interested.
0:35:22 That’s actually kind of fire. I like that. See, the web’s getting more fun. I told you guys,
0:35:27 it’s getting more fun. Yeah. So if you’re listening, it zooms in on the bunny. The bunny gets pissed off
0:35:34 and starts like shaking. It won’t let me out. My computer’s hacked. It’s over. The bunny’s got it.
0:35:38 Okay. There we go. Now we take all their money and it’s a great business. That is really cool. The way it
0:35:43 like animates in, that looks kind of natural. There’s something here. That’s fun. Anyway.
0:35:48 The rabbit looks so cute until you’re not interested. Do that like a pricing page, right? Like somebody’s
0:35:53 like looking at the lower tier and like the bunny’s kind of sad now. It’s like, can’t you spend some
more? When you cancel your subscription on Duolingo, the green owl, like, guilt-trips you or threatens you or
0:36:05 something. Yeah. But like even something as simple as this, like this used to take a while for someone to
0:36:09 like design a character and create an animation. Like, look at us. We’re doing it for no, literally
0:36:14 no reason. Like, we’re just like, sounds kind of fun to try to make the bunny.
That used to take a team, right? Like now a person with an idea can just do it themselves,
which is wild. If you think about all the things you’re gonna be able to do with 4o,
0:36:26 it’s just nuts. I saw this tweet from Balaji earlier where he was talking about all the different
0:36:30 things that are gonna be impacted by this. You know, if you think about advertising, like what does this do
0:36:35 to advertising now that you literally can just, you can make an ad from scratch for anything you
0:36:40 want, you know? Yeah. And then now you can create a landing page with V0. It’s like one person in one
0:36:43 day could create like a hundred landing pages with like a hundred different ads and then have AI help
0:36:47 set those up probably in the near future. That used to take an entire team. That could be like 10 people,
0:36:52 20 people working on that. Yeah. I mean, in the same way that I was making thumbnails earlier,
0:36:56 if you wanted, you could go find ads, right? Like in the marketing world, they, you know,
0:37:00 they have a thing called a swipe file, right? Where you keep it, like you save images that you
0:37:04 find around the internet that maybe you want to use again later. You can have a swipe file of ads that
you really like, throw them into GPT-4o and then say, make an ad like this, but use my logo and change
0:37:16 the text to this. And it’ll make the ad, but with your logo and the new text that you want, but in that
0:37:21 same design, it’s just wild what you can do now. I saw one example where somebody took a product photo
of theirs and they said, make me an ad. Like imagine the people from Mad Men were making an
advertisement for this. And it made the ad, and it looked beautiful. It was like a really big thing,
you know, huge text, huge font, and just long text
underneath. And it was beautiful. Super cool. Well, is there any other ground that
0:37:43 we want to cover in this episode? I mean, we’ve sort of talked about all the different models that
0:37:47 have come out since the last time we’ve chatted. We’ve talked about the updates to V0. We’ve talked
0:37:52 about cursor versus windsurf. We’ve talked about MCPs. I feel like we covered like a lot of ground,
0:37:55 but is there anything that we haven’t touched on that we probably should?
0:38:00 Hmm. Honestly, the only thing that I think should be said at this point is just to like,
0:38:06 get your hands dirty and try this stuff because you can only spend so much time comparing the top models
0:38:12 and the top tools. Your ratio of consuming, you know, content on these tools and actually using them,
0:38:16 you know, you should spend more time using these tools because that’s actually how you learn. And
0:38:20 that’s how you actually build like a felt sense for it. And I’ve been vibe coding with a purpose.
0:38:25 I’m actually like building an app right now that has been like really, really, really fun to use.
0:38:31 And for me, I’ve cared less about the tools and more about the just like kind of just celebrating the
0:38:38 fact that we can vibe code. You can just concept things and you can share ideas for apps or for
0:38:42 specific features or designs incredibly fast if you just know how to use them.
0:38:49 And so I would just work on just as soon as you have an idea, make it real and practice that over
0:38:53 and over again, because eventually you’ll find an idea you want to spend, you know, the next decade on.
0:38:55 I truly believe that. So that’s kind of my thoughts on it.
0:38:59 Yeah. You know, one of the things that I’ve found really helpful, and I think we might’ve talked about
0:39:05 this the last time we chatted is that it’s cool to just make little quick apps that solve workflow
0:39:10 problems that you have. Right. If you find yourself doing something repeatedly, you can probably create
0:39:14 an app that can automate that thing that you find yourself doing repeatedly. And you could probably do
0:39:21 it in less than an hour. Right. It’s just, it’s crazy that you can just, you know, have AI solve these
0:39:28 problems by creating little apps for you. I think one piece of advice that I would give is also learn
about, you know, committing and pushing to GitHub as well, because one of the things that I’ve run into
0:39:41 in the past is while coding, I will have it make like a small change, but it’ll break something completely
0:39:46 somewhere else on the website or on the app I’m making. And then I can’t actually get it to restore
the previously working version. Well, if you know how to use GitHub, that kind of solves it. I know
Cursor actually has a feature built in where you can sort of roll it back, but I’ve noticed sometimes that
0:40:03 doesn’t always work. I’ll try to like roll it back and the rolled back version doesn’t work.
So I’ve found that having it restore from a previous commit on GitHub has still been the
best solution, just to make sure you’re sort of saving as you go.
Yep. And it’s as simple as creating an account on GitHub, creating a new repo, and then pasting
the repo URL into Cursor and saying, commit this, and Cursor will commit it directly. All you have to do is ask.
0:40:29 And so it’s not as hard as you think if you’re worried about it. I was scared of GitHub to start
0:40:32 because GitHub sounds scary. I didn’t know you can do that. I’m still manually doing it. I didn’t know you
0:40:37 could do this. Yeah. It’s great. Oh, oh, just ask. Just ask it to create a repo. It’ll do it.
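The save-as-you-go habit Matt and Riley describe is just a couple of git commands under the hood. A rough sketch, assuming git is installed; the repo name, commit message, and GitHub URL below are invented placeholders, and the push lines are commented out because they need a real repository:

```shell
# Rough sketch of the commit-and-restore flow described above.
# Repo name and commit message are placeholders.
cd "$(mktemp -d)"
git init -q my-vibe-app && cd my-vibe-app
echo "# My vibe-coded app" > README.md
git add README.md
git -c user.name="You" -c user.email="you@example.com" \
    commit -q -m "Working version: landing page loads"
# To push to GitHub (needs a real repo URL from github.com):
#   git remote add origin https://github.com/<you>/my-vibe-app.git
#   git push -u origin main
# Later, find a known-good commit hash to restore from:
git log --oneline
#   git checkout <commit-hash> -- .   # bring files back from that commit
```

This is roughly what Cursor runs on your behalf when you ask it to commit; knowing the underlying commands makes the rollback trick much less scary when the built-in restore doesn’t work.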
Well, there’s even a GitHub MCP now, too, so it’s even easier. It’ll sort of, you know,
check directly with GitHub to see what changes have been made instead of having to run a bunch
of git commands and pull information back. It will just look directly using the MCP. So it speeds it up
0:40:59 slightly. But yeah, you can directly connect now. It’s amazing. So people have no excuses now,
0:41:02 like they want to build something. They’ve always had that idea and they’re like, oh,
0:41:05 I could build that. But I don’t I don’t I can’t hire all the engineers or the designers.
You don’t have that excuse anymore. Like you can do it. Become the designer. Yeah. Become the engineer.
0:41:16 Yep. Yep. I mean, I’ve always sucked at design, too. And like some of these tools,
0:41:19 they’ll make designs. And I’m like, that looks good. I could have never thought of that myself.
0:41:23 Yeah. Like angry bunny here. If you’re a decent designer, too, they make you better. I’ve always been
0:41:26 around a bunch of amazing designers and, you know, kind of through osmosis became a decent
0:41:31 designer myself from that. But then like these tools will make you better as well, which is crazy.
0:41:36 Mm hmm. Well, cool. So what sort of apps are available? I know last time we chatted,
0:41:40 you were working on Yap Thread. Is that still out there available? Is that one of the ones you’re
0:41:46 still pursuing? Yeah, Yap Thread. The company that I started where we decided we were going to build
0:41:53 some apps, we found another thing we wanted to work on. And I’m very glad we did. It is tailored
0:42:00 specifically for people interested in vibe coding. We haven’t announced it publicly yet, but I’m sure
0:42:04 you guys have seen videos of it on Twitter. That’s all I can say. Okay, cool.
0:42:10 Cool. Well, once it’s finally launched, maybe we’ll have to have you back and you can give us a tour
0:42:15 of it. Cool. Will do. Awesome. Well, where should people go check you out? I know you’re on Twitter
0:42:18 and Instagram and all the places. Where’s the best place for people to go follow you?
There is no best place. I don’t know. Twitter? Twitter is great. I’m just making content at this
point on the things I like. Every platform is different. So it depends if you want a little bit of me,
a little more, or a lot. You can go to YouTube and watch my longer videos. All platforms. Like I talk
0:42:39 about different things. So, yeah. Awesome. Well, thanks so much for joining us and demoing this stuff
0:42:43 and sort of nerding out about vibe coding with us. Really, really appreciate you hanging out with us
0:42:57 today. Yeah, this is a great time. Love it.
0:43:01 Bye.

Episode 52: How has the landscape of AI coding transformed in just a few months? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) are back with Riley Brown (https://x.com/rileybrown_ai), a leading figure in the vibe coding movement. Riley is known for his innovative approach to coding using AI, which has captivated and empowered developers worldwide.

In this episode, the trio delves into the rapid advancements in AI tools like Cursor and Windsurf, sharing insights on how these updates have revolutionized the coding experience. They tackle the developments in AI models, the introduction of MCPs (Model Context Protocols), and how these innovations are shaping the future of web development. The episode wraps up with a fun and insightful vibe coding session, creating a unique web experience to demonstrate the power and potential of AI in real-time.

Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd

Show Notes:

  • (00:00) AI Tools: Cursor and Windsurf
  • (03:30) AI Debugging and Development Tips
  • (07:26) OpenAI’s MCP Integration Excitement
  • (12:11) YouTube Thumbnail Modifications
  • (14:51) Admiration for Midjourney’s Unique Approach
  • (19:19) Efficient Code Analysis with Gemini
  • (21:21) Anthropic Models Outperform in Cursor
  • (23:30) V0’s New Image Paste Feature
  • (26:40) Streamlining Design to Code Workflow
  • (30:23) V0: Beyond Minimalistic Web Design
  • (35:33) Ad Creation with GPT-4 Swipe Files
  • (38:10) GitHub: Best for Code Restoration
  • (40:24) Vibe Coding App Teaser

Mentions:

Check Out Matt’s Stuff:

• Future Tools – https://futuretools.beehiiv.com/

• Blog – https://www.mattwolfe.com/

• YouTube- https://www.youtube.com/@mreflow

Check Out Nathan’s Stuff:

The Next Wave is a HubSpot Original Podcast // Brought to you by HubSpot Media // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
