AI transcript
0:00:10 All of the aspects of our lives are going to be intermediated by the models, and we’re going to pay for that.
0:00:17 The definition of what a companion is has evolved so quickly from either a friend or a girlfriend to like anything,
0:00:23 any sort of advice or wisdom or entertainment or counsel you could have gotten from a human before.
0:00:27 Maybe you just need to feel connected to something. It doesn’t need to be human.
0:00:33 I think it’s been a puzzle to me what the first AI social network is going to look like.
0:00:37 To work, a social network has to have like real emotional stakes.
0:00:44 We’re living in this early era of AI where velocity is the moat.
0:00:48 For decades, consumer tech followed a similar beat.
0:00:51 New platforms, new behaviors, new breakouts.
0:00:54 Facebook, Twitter, Instagram, Snap, TikTok.
0:00:56 But lately, that rhythm has changed.
0:00:59 And so has the nature of what we mean by consumer.
0:01:04 In this conversation, we bring together A16Z’s consumer and AI investing minds to ask,
0:01:06 what is the state of consumer in the AI era?
0:01:11 You’ll hear how creative tools like MidJourney and Veo are reshaping expression,
0:01:13 how voice is becoming the new interface,
0:01:17 and how companions, AI ones, are filling in social white spaces.
0:01:20 We talk about retention and revenue curves,
0:01:22 defensibility beyond network effects,
0:01:24 and why velocity might be the new moat.
0:01:26 We also get speculative.
0:01:29 What happens when AI knows you better than your friends do?
0:01:31 What does the next social platform look like?
0:01:34 And are we heading towards a world where software, not shoes, not handbags,
0:01:36 is the new luxury good?
0:01:39 This episode is about new form factors, new business models,
0:01:41 and a new definition of connection.
0:01:42 Let’s get into it.
0:01:47 As a reminder, the content here is for informational purposes only.
0:01:50 Should not be taken as legal, business, tax, or investment advice,
0:01:53 or be used to evaluate any investment or security,
0:01:57 and is not directed at any investors or potential investors in any A16Z fund.
0:02:01 Please note that A16Z and its affiliates may also maintain investments
0:02:02 in the companies discussed in this podcast.
0:02:06 For more details, including a link to our investments,
0:02:10 please see A16Z.com forward slash disclosures.
0:02:17 It seems like every few years there was a breakout,
0:02:24 starting from Facebook, Twitter, Instagram, Snap, WhatsApp, Tinder, TikTok.
0:02:28 Every few years there was this sort of new paradigms, new breakout,
0:02:30 and it feels like at some point a few years ago it just stopped.
0:02:33 Why did it stop, or did it stop?
0:02:34 Would you reframe how we should think about that,
0:02:36 and where do we go from here?
0:02:40 I would argue probably ChatGPT was a huge consumer outcome
0:02:41 and winner in the past few years.
0:02:46 And we’ve also seen a bunch of other ones in various other modalities of AI,
0:02:50 in image and video and audio companies like MidJourney and Eleven Labs
0:02:54 and Black Forest Labs, and now things like Kling and Veo.
0:02:57 Weirdly, though, a lot of them don’t have the same social
0:02:59 or traditional consumer dynamics that you mentioned.
0:03:04 I think because AI is still relatively early and so much of the new products
0:03:06 and innovation has been driven by research teams
0:03:09 who are like so good at training models,
0:03:13 but historically have not been amazing at creating the consumer product layer around them.
0:03:18 So I think the optimistic view is that the models are now mature enough
0:03:21 and many are available either open source or via API
0:03:25 for people to build great, more traditional consumer products on top of them.
0:03:30 It’s interesting that you asked that question because I was thinking about the past,
0:03:34 what, 15 years, 20 years, where, as you said, like Google, Facebook, Uber, all the names.
0:03:39 And it’s interesting because when you think about internet, mobile, cloud, everything together,
0:03:41 there were all these amazing names.
0:03:47 I think the cloud, mobile, all that had a lot of maturity baked in.
0:03:50 Like the platform was around for like 10, 15 years.
0:03:53 Every little nook and cranny has been explored to some extent.
0:03:59 The changes that people had to adopt were Apple coming out with new features,
0:04:04 as opposed to now, where the changes people need to adopt are the underlying, relentless model updates.
0:04:06 So I think that’s one difference.
0:04:09 But the other thing is, again, Justin, you touched on this,
0:04:12 but if I think about the past historical winners,
0:04:15 there’s like the information area, the Googles of the world.
0:04:17 And now I think ChatGPT is certainly doing that.
0:04:21 And there’s the utility ones we missed out on, like the Box and Dropbox of the world.
0:04:24 They’re more consumer, prosumer-y things that people use,
0:04:28 where we also see a lot of the companies attracting and going after that use case.
0:04:30 Expression, creativity, same thing.
0:04:33 The creative tools are endless and that’s happening.
0:04:35 What I think is missing potentially is connection.
0:06:40 Like this social graph, this thing hasn’t been rebuilt on AI yet.
0:04:46 And that may be just a white space or something that we just continue to see what develops there.
0:04:49 It’s interesting because Facebook’s almost 20 years ago at this point.
0:04:53 Like the companies that you mentioned, Justine, aside from ChatGPT and OpenAI,
0:04:55 are they going to be around 10, 20 years?
0:04:57 Like what is the defensibility of the companies we’re talking about?
0:04:59 And also the use cases of all the companies I mentioned,
0:05:02 are they going to be disrupted by these new players?
0:05:07 Or in 10 years from now, will they continue to be sort of the mainstream application for all those use cases that they serve?
0:05:15 I mean, you could argue that ChatGPT has got way higher business model quality than the analogous consumer companies from the last product cycles, right?
0:05:17 Their top SKU is $200 a month.
0:05:20 The top Google consumer SKU is $250 a month.
0:05:24 So sure, there’s a question of defensibility, networks, all these other things.
0:05:30 But that might have been a response to the poor business model quality that would have occurred if you didn’t have those things.
0:05:34 Now you can just charge people a lot of money and perhaps we’ve been overthinking it previously.
0:05:35 Yeah.
0:05:40 There was poor business model quality, but maybe stronger sort of retention or product market fit or durability.
0:05:46 Yeah, like you had to have a story for how this was like compounding enterprise value in the absence of just making money right away.
0:05:49 And now these models and these companies are just making money right away.
0:05:50 Yeah.
0:05:55 I think the other thing is, Justine, you talked about this, like all the foundation models are kind of pointy in different ways.
0:06:01 So you could say, look, Claude and the ChatGPT horizontal model and the Gemini model, aren’t they interchangeable?
0:06:02 And doesn’t that mean price pressure?
0:06:05 But different people use them for different things.
0:06:07 And it seems like they’re raising prices, not lowering them.
0:06:13 So I think when you like zoom in a little closer, you see that there are like some interesting defensibility dynamics that are already there.
0:06:22 Increasing price, not decreasing, is an interesting point, because monetization is clearly a different thing from the previous era to the AI era, especially for consumer companies.
0:06:24 They’re making money right away.
0:06:34 One thing that’s always on my mind, and Olivia, tell me if you think that’s not correct, but like the retention, when we talked about retention on the consumer subscription model before AI,
0:06:43 I don’t know if we actually try to make a differentiation between unique user retention and revenue retention, because they’re like kind of the same.
0:06:45 Like you don’t get to change pricing that often.
0:06:46 You don’t get to upgrade.
0:06:47 Like it’s the same thing.
0:06:56 As opposed to now, we make a very clear differentiation between unique user retention and revenue retention, because people actually upgrade.
0:07:03 They actually have all these like credits and points and overages that they actually end up spending.
0:07:11 So you actually see revenue retention being meaningfully higher than unique user retention, which again, like I haven’t seen that before.
0:07:11 Yeah.
0:07:11 Yeah.
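To make that distinction concrete, here is a toy calculation with illustrative numbers that are not from the episode: revenue retention can sit above 100 percent even while unique-user retention falls, as long as the users who stay upgrade their plans or buy enough credits and overages.

```python
# Toy cohort with illustrative numbers (not from the episode).
start_users, start_arpu = 1_000, 20.0   # 1,000 subscribers paying $20/month
retained_users = 500                    # 50% unique-user retention a year later
retained_arpu = 48.0                    # retained users upgraded or bought credit overages

user_retention = retained_users / start_users
revenue_retention = (retained_users * retained_arpu) / (start_users * start_arpu)

print(f"unique-user retention: {user_retention:.0%}")    # 50%
print(f"revenue retention:     {revenue_retention:.0%}")  # 120%
```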
0:07:17 I think before the average consumer subscription was maybe $50 a year, if that.
0:07:18 That was kind of a lot.
0:07:21 Like the best in class consumer products would charge that.
0:07:24 And now we have people very happily paying $200 a month.
0:07:25 Wow.
0:07:29 And even saying in some cases that they feel like they’re being undercharged for that or they would pay more.
0:07:30 How do we explain that?
0:07:32 What value are they getting such that they’re paying more?
0:07:34 I think it’s doing work for them.
0:07:40 Like consumer subscriptions in the past were on things like, I don’t know, personal finance, fitness, wellness, like things that.
0:07:41 Entertainment.
0:07:41 Yeah.
0:07:45 But they were things that ostensibly would help you help yourself, entertain yourself.
0:07:48 But you would have to invest a lot of time to get the value from them.
0:07:55 And now with products like Deep Research, for example, that could replace 10 hours of generating a market report by yourself.
0:08:00 And so that kind of thing is easily worth, I think, for many people $200 a month, even on one or two generations.
0:08:05 I mean, I think things, too, like Veo 3, like people are paying $250 a month.
0:08:10 Unfortunately, I have since run through it. It’s a limited credit plan, the $250 one.
0:08:11 You’re the whale, Justine.
0:08:13 You’re the $1,000 a month.
0:08:16 I’ve since charged several more of the $50 credit packs.
0:08:20 So I probably spent, if I had to guess, like over $500 in the first few weeks on Veo 3.
0:08:22 The revenue retention is higher.
0:08:22 Exactly.
0:08:30 And I’m happy to pay that because it’s like you suddenly have this magical mystery box that you can open and get whatever video you want, if only for eight seconds.
0:08:32 But it’s incredible.
0:08:33 And the characters can talk.
0:08:41 And you can make amazing things that you can share with friends, make like personal memes of someone delivering a message to your friend with their name in it.
0:08:45 Create full stories that people are posting on Twitter and Reddit and all of these different places.
0:08:50 It’s sort of like nothing we’ve seen before in terms of what consumer products can actually do for people.
0:08:51 Yeah.
0:08:55 It seems like every part of consumer discretionary spend is going to be overtaken by software.
0:09:00 And I think in the future you’re going to see consumer spend be like food, rent, software.
0:09:03 And that’s kind of where we’re going, which is what Justine is speaking to.
0:09:04 And can you give us some examples of that?
0:09:06 Well, a lot of it is what Olivia said, right?
0:09:09 So I think all the entertainment is being subsumed by it.
0:09:14 A lot of the sort of creative expression work that you would do outside of software is now being subsumed by it.
0:09:20 A lot of the sort of relationship intermediation, which might have been a place for disposable income spend, is being subsumed by it.
0:09:27 So all of the aspects of our lives are going to be intermediated by the models, and we’re going to pay for that.
0:09:35 Brian, you’re saying what we’re still missing is connection from this new paradigm, and people are still relying on sort of Instagram, Twitter, some of the other sort of social networks of the past.
0:09:38 What’s going to get us to something new here in the realm of connection?
0:09:40 Or has that just been one data network effect?
0:09:50 You know, it’s funny, when I think about social, which is a category that I get so excited about, at the end of the day, a lot of it was status update, right?
0:09:53 Facebook, Twitter, Snap, it’s just like, here’s what I’m doing.
0:09:57 And through status update, you feel connected to that person.
0:10:01 And that status update showed up in different modalities.
0:10:10 It used to be, here’s what I am, here’s what I’m doing, to actual photos of where you are and what you’re doing, to videos and short-form videos now.
0:10:14 So now people feel connected to others through reels and what have you.
0:10:19 So I think that has been one era of feeling connected with others.
0:10:22 Now the question is, how can AI help that?
0:10:28 How can AI make it feel like you’re connected to other human beings and know what’s going on in your friends’ lives?
0:10:36 The truth is, if I just think of a modality of photo, video, audio type things, I think a lot of it has been explored.
0:10:42 Different versions and mutations of that have been explored quite extensively, especially on mobile.
0:10:50 I think where we could get to is, it’s funny, I don’t know about you guys, but I pour my heart and soul into ChatGPT.
0:10:53 It knows more about me than probably Google, potentially.
0:10:56 Which is an insane thing to say.
0:11:03 Like Google, I’ve been using Google for a decade plus and ChatGPT may know more about me than Google because I type more, I tell it more, I give more context.
0:11:11 What might connection feel like when that essence of me is shareable with others?
0:11:23 And I don’t know if that’s the next version of feeling connected, but I can certainly see a world where that is, that resonates with a lot of folks nowadays, younger generation, et cetera, that are tired of just looking at the surface of all the stuff.
0:11:40 I mean, we already see some examples of exactly that, where like, there’s all these viral trends where people are like, I asked my ChatGPT based on everything you know about me, write my five strengths or weaknesses, or like make an image of who you think the essence of me is, or make a comic like about my life.
0:11:41 And people are sharing those everywhere.
0:11:49 I posted one the other day, and within minutes, I had dozens of people responding like with their own and sharing stuff, people I didn’t even know.
0:12:02 I think the interesting thing, though, is so far, the social behavior that has come from the AI creative tools largely, but also things like ChatGPT, is still happening on the existing social platforms and not in the new AI platforms.
0:12:06 Like, Facebook now is like, a lot of AI content.
0:12:09 Potentially unbeknownst to some of the audience.
0:12:15 Facebook is like the boomer AI slop, and then like, Reddit and Reels are like the younger people AI content.
0:12:15 Yeah.
0:12:17 I agree.
0:12:27 I think it’s been a puzzle to me what the first AI social network is going to look like, because we’ve seen attempts at, for example, like a feed of pictures of you that are AI generated.
0:12:34 And I think the problem there is that to work, a social network has to have, like, real emotional stakes.
0:12:44 And if you can generate the content in a way that you like it, and you always look amazing, and you always look happy, and you’re always in a cool background, like, it doesn’t have the same sense of stakes.
0:12:48 And so I don’t think we’ve seen the version of what a ground-up AI social network would be.
0:12:57 Like, to use the word skeuomorphic, a lot of the AI social products mimic the Instagram feed or Twitter feed but with bots and AIs.
0:12:59 That feels skeuomorphic.
0:13:01 That feels like, this is what it used to look like.
0:13:02 We’re going to do it with AI.
0:13:05 And maybe that’s not really the form factor.
0:13:13 And, you know, there’s an additional hurdle in my mind that a true consumer product probably needs to live on mobile.
0:13:23 And for AI products to work really, really well, I think there’s still a little bit of work before the cutting-edge models can live on edge, live on the device side of things, to really enable that.
0:13:26 So I’m also excited to see what happens there.
0:13:30 It seems like people recommendation is the obvious use case at some point.
0:13:32 Like, who would be good for me to start a business with?
0:13:33 Who would be good for me to be friends with?
0:13:34 Who would be good for me to date?
0:13:37 These platforms get all this information about us.
0:13:37 Connect the dots.
0:13:50 I mean, I think an interesting area that’s maybe informed, like, where this all goes is if you look at the AI native LinkedIn efforts, the observation is that LinkedIn is a pointer to what you know instead of actually containing what you know.
0:13:58 And with this tech, we can create a profile that actually contains what you know, so I can talk to synthetic ET and get all of your wisdom.
0:14:01 Perhaps that’s what future social looks like as well.
0:14:03 That’s what you’re talking about, Justine, right?
0:14:09 If the models already know who you are, then is there, like, a synthetic you you can deploy in an interesting way to interact with people?
0:14:10 I don’t know.
0:14:20 One thing I heard you guys say, which surprised you when you realized it, was that enterprises are sometimes adopting these products before consumers, which feels different from the previous era, or maybe not what we expected.
0:14:21 What can we say there?
0:14:23 Yeah, that has been fascinating.
0:14:27 And BK and I saw that a lot with Eleven Labs, where we were relatively early.
0:14:31 I think we did the Series A a month or so after the initial launch.
0:14:37 And I think what we saw was first the early adopter consumers got on board and they were making memes.
0:14:39 They were making fun video and audio.
0:14:40 They were cloning their own voices.
0:14:41 They were doing game mods.
0:14:47 But then I would argue it hasn’t even gone in many cases to the true mainstream consumer.
0:14:53 Like, it’s not yet the case that every single person in America, or even most, have Eleven Labs on their phone or have a subscription.
0:15:03 But the company has these massive enterprise contracts and a ton of huge customers across, like, conversational AI, entertainment, tons of different use cases are using Eleven.
0:15:10 And I think we’ve seen this across a bunch of AI products, which is, like, there’s an initial consumer virality moment.
0:15:17 And then that actually leads to lead generation in enterprise sales in a way that we did not see with the last generation of products.
0:15:27 Like, enterprise buyers, there’s so much of a mandate to have AI now, an AI strategy and use AI tools, that they’re watching places like Twitter and Reddit and all of the AI newsletters.
0:15:38 And they’re saying, like, hey, this looks like a random consumer meme product, but I can actually think of a really cool application of that in my business and become the hero for having our AI strategy.
0:15:45 I’ve also heard of, like, similar in that vein, really exciting use cases of AI where you start with consumer virality.
0:15:50 So, you know, from a company side, you get all these Stripe payment data.
0:15:57 You look at all the Stripe sales and you basically put it in an AI tool to go try to find where they work.
0:16:06 And then when you find more than X number of people working at that company, you reach out and say, hey, by the way, looks like 40 plus people are using our product.
0:16:07 What’s up?
0:16:13 I think the fact that they can do it with one person in an hour was, like, what really struck me.
0:16:17 It was a chief of staff guy who was like, yeah, what I do is this.
0:16:19 And, like, it does it all in, like, a couple minutes.
0:16:24 And then I send a mass email out and that’s, Jesus, okay, that’s like a go-to-market at extreme speed.
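As a rough sketch of the workflow described above: group self-serve customers by employer and flag companies with enough users to justify an outreach email. The field names and the enrich_employer() helper are hypothetical stand-ins for whatever payments export and LLM or data-enrichment lookup a team actually uses; this is not a real Stripe API.

```python
from collections import Counter

def enrich_employer(email):
    # Hypothetical stand-in: a real version might ask an LLM or an enrichment
    # service to map a work email to an employer; here we just keep the domain.
    domain = email.split("@")[-1].lower()
    return None if domain in {"gmail.com", "yahoo.com", "outlook.com"} else domain

def find_enterprise_leads(customers, threshold=40):
    # Count paying users per employer and keep companies above the threshold.
    counts = Counter()
    for customer in customers:
        employer = enrich_employer(customer["email"])
        if employer:
            counts[employer] += 1
    return {employer: n for employer, n in counts.items() if n >= threshold}

# Example: customers exported from a payments dashboard as [{"email": ...}, ...]
leads = find_enterprise_leads([{"email": "jane@examplecorp.com"}] * 45)
print(leads)  # {'examplecorp.com': 45}
```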
0:16:28 Justine, you rattled off a list of products and companies in the beginning of this conversation.
0:16:33 What I’m curious is, do you think just as examples, are they sort of the MySpace or Friendster?
0:16:37 Are we in sort of that era or are they the list of companies I rattled off that are still relevant 20 years later?
0:16:39 Like, where are we right now?
0:16:47 I mean, I think our hope always is that every big consumer AI company now that we see and love and use all of the products, which we all do, sticks around.
0:16:50 I think, unfortunately, that’s not always going to be the case.
0:17:02 I think maybe the interesting differentiation in AI versus the last era of consumer products or even two eras before is, like, the model layer and the capabilities are still improving.
0:17:07 Like, we have really not even, I think, in many cases scratched the surface of what these models can do.
0:17:13 I think we’ve seen that in things like the Veo 3 launch where it’s, like, you can suddenly have multiple characters talking.
0:17:14 You can have native audio.
0:17:16 You can do all of these things.
0:17:20 Like, all of these modalities, I don’t know, maybe we could argue about this with the tech people.
0:17:26 The LLMs are more mature, but have the opportunity to just keep improving capabilities as they scale.
0:17:43 And I think what we’ve seen is as long as a company stays at what we say is sort of, like, the technology or the quality frontier, so as long as they sort of have a state-of-the-art model or are integrating one or something like that, they won’t become, like, the MySpace or Friendster or whatever.
0:17:49 Like, they just keep – you fall a little bit behind, you ship the new update, suddenly you’re number one again, and you keep moving.
0:17:54 The interesting thing now, though, too, is we’re starting to see even segmentation in that.
0:17:58 So, like, in image, for example, there’s not just one best image model.
0:18:00 There’s, like, best image for designers.
0:18:02 There’s best image for photographers.
0:18:08 There’s best image for people who can only pay $10 a month versus the people who can pay $50 or $100 a month.
0:18:17 And so I think there can be – just because Anish mentioned people are spending so much, there can be multiple winners that persist over time as long as they keep shipping.
0:18:19 I absolutely agree.
0:18:22 I mean, even in video, it’s, like, a different video, but ad video.
0:18:28 And then even in ad video, I saw a post yesterday, I’m like, this is best for product shots and this is best with people.
0:18:32 And it goes on and on, and each of those, I think, is a very large market.
0:18:32 Yeah.
0:18:40 Say more about how – I know we talk a lot about defensibility and moats and how that has changed in this era, how we’ve changed how we consider that topic.
0:18:46 I’ve gone through a little bit of a come-to-Jesus moment on that, especially recently.
0:18:48 I think moats has always been very important, right?
0:18:53 The gold standard, this network effect, being part of the workflow, being system of record.
0:18:54 And these are all very, very important moats.
0:18:57 And I will posit that that’s still very important.
0:19:08 But funny enough, like, I would say the companies or investments that I’ve reviewed with this moat-first theory have not really been the winners.
0:19:18 And the winners in the category that we look at have always been the ones that break the mold, move really fast, have these incredible model launches, have these incredible product generation speeds.
0:19:28 And I’ve sort of come around that in that we’re living in this early era of AI where velocity is the moat.
0:19:36 And whether that’s in distribution, which is incredibly important and hard to break through the noise these days, but also followed with product velocity.
0:19:49 That’s what wins the game because that’s what leads to mindshare and, frankly, right now, mindshare and users and traffic that actually converts to real revenue that gives you more ability to continue that journey.
0:20:00 It’s interesting, Ben Thompson, I think a decade ago at this point, had this blog post called Snapchat’s Gingerbread Strategy, where he was basically saying, hey, anything Snap can do, Facebook can do better.
0:20:04 But Snap is just going to keep sort of coming up with the next sort of innovation.
0:20:07 And if they can just keep doing that, maybe that’s their moat.
0:20:08 And he called it the gingerbread strategy.
0:20:10 It’s a good, it’s a good theory.
0:20:11 I think you just keep going.
0:20:13 Keep adding candy to it?
0:20:13 Yeah, that’s funny.
0:20:17 And like, at some point, it’ll be such a beautiful gingerbread house.
0:20:20 You know, I think that worked to some extent.
0:20:20 Yeah.
0:20:24 And famously, I think Evan joked that he’s the chief product officer of that.
0:20:28 I think distribution and network effect ultimately kicks in, right?
0:20:36 And Snap has that too on its own, where it sort of has a corner of Gen Z and the younger users as like a core messaging platform.
0:20:38 How do we think about network effects with these new products?
0:20:44 I think we’re not there yet, because it’s mostly creation efforts right now.
0:20:50 There isn’t really a closed loop of creation, consumption, network effect, social network.
0:20:53 So I think we’re still a little early before a network effect kicks in.
0:21:00 But I think we see a different type of moat form in the likes of Eleven Labs.
0:21:07 Like I said, because it moves so fast, because the product is very good, it gets to go into enterprise and gets to get locked in into the workflow.
0:21:13 So I think that version of moat we’re starting to see, I think the true network effect we’re still looking out for.
0:21:15 I think Eleven is an interesting example.
0:21:20 I was making an AI generated video the other day that I needed a voiceover for.
0:21:26 And Eleven now, because they had a head start, they had the best models, which then more people were using the product.
0:21:27 They can make the models better.
0:21:29 All of these compounding advantages.
0:21:34 They now have a library of people who have uploaded their own voices and their own characters.
0:21:44 And so for me, when I was looking across a bunch of voice providers, if I needed like a very specific, like old wizard mystical voice, like Eleven had 25 options that fit what I needed.
0:21:48 Where another platform might have, I don’t know, two or three.
0:21:52 And so I do think it’s early, but we’re starting to see signs of that.
0:21:56 But they’re more like traditional network effects that we saw with old marketplaces.
0:21:58 They’re not necessarily something like completely new.
0:22:02 I want to go deeper on voice as we talk about sort of new paradigms and form factors.
0:22:08 We got excited about voice pretty early on, or we’re the first friend that I saw sort of a thesis around it.
0:22:13 Anish, why don’t you talk about what got you so excited about voice in this new paradigm and what sort of played out and what hasn’t yet?
0:22:14 Where do you think is going?
0:22:18 I mean, the original observation, and Olivia really is our voice expert, so we should hear from her.
0:22:24 But the original observation that got us started was that voice has intermediated human interaction since the beginning of time.
0:22:28 And yet it’s not been a substrate on which technology has been applied, because the tech just never worked.
0:22:34 And there’s all these previous efforts, voice XML and voice apps, and it just, it simply didn’t work.
0:22:36 The technology wasn’t ready yet.
0:22:40 And even then there was these pockets of Dragon Naturally Speaking and all these products from the 90s.
0:22:44 So there was always interest in voice, but it never made sense as a technology substrate.
0:22:48 And now with the generative models, you can just use voice as a primitive.
0:22:53 So it’s sort of unexplored, yet so critical to our day-to-day lives.
0:22:56 It feels like a perfect area where you’ll see a lot of AI-native efforts.
0:23:07 I think we first got excited about voice from more of a consumer perspective, like the idea of an always-on, like, coach or therapist or companion in your pocket that you can talk to.
0:23:09 And that has started to play out, I would say.
0:23:11 There’s lots of products where that’s working.
0:23:28 I think what surprised me, at least, is as the models got better, like, real enterprises have picked up voice so quickly to replace human beings on the phone or to augment what human beings are doing on the phone, even in really sensitive and critical categories like financial services.
0:23:36 Because previously they were using offshore call centers that also had lots of compliance issues and had 300 percent annual turnover and were really difficult to manage.
0:23:44 And so I think we’re still waiting to see in many ways what the first great, truly net new consumer voice experience will look like.
0:23:46 There’s some early examples.
0:23:51 I think people are pulling ChatGPT advanced voice mode into fascinating directions.
0:24:00 We’ve seen products like Granola blow up because they allow people to finally, for the first time, do something valuable with all of the things that they’re saying all day.
0:24:05 But the great thing about consumer is it’s completely unpredictable and the best products emerge out of nowhere.
0:24:07 Otherwise, they would have been built already.
0:24:11 So I’m excited to see what happens in consumer voice in the next year.
0:24:11 For sure.
0:24:16 I mean, it feels like voice is the AI insertion point for the enterprise, period.
0:24:25 And I think the thing that everybody is missing right now is that the sort of mental model many folks have is that the low stakes conversations will be AI voice, the customer support, et cetera.
0:24:39 But what we’ve talked about is like the most important conversation that happens in a business in a given day, week, year is going to be intermediated by AI because AI will just do a better job with the negotiation or the sales pitch or the persuasion or the friendship.
0:24:47 What’s going to be sort of the first use case where people are going to be talking to synthetic versions of ourselves, like in a sort of consistent, relevant way?
0:24:51 Like why are they going to be talking to sort of AI Justine or AI Anish or AI Me?
0:24:52 We’ve seen a little bit of that.
0:25:02 There’s companies like Delphi that sort of create AI clones of people who have a big knowledge base that they can go and reference and you can get advice or get feedback or things like that.
0:25:04 And Brian sort of alluded to this earlier.
0:25:13 There’s this really interesting question of what if you allow not just like thought leaders or experts to have this AI clone that you can talk to via text, voice, maybe even video one day.
0:25:32 One of the things we think a lot about in consumer is there’s a lot of people who basically have had some sort of skill or insight or knowledge, whether it’s your friend from high school that’s like insanely funny and you always thought they should have a comedy cooking show, but they just never were able to break through or get it.
0:25:44 Or your guidance counselor who had incredible advice, like how can we enable those people to essentially scale themselves in a way that they never could before having an AI clone or an AI persona?
0:25:54 What we’ve seen thus far is a lot of that has been either thought leaders or experts or on the other, like total other end of the spectrum, like characters that people already know and like.
0:26:07 We saw early versions of that with character AI, which added a voice mode where there’s this pull, especially when you’re trying out a new technology to have some sort of familiarity of I’m talking to this character from my favorite anime series that I already know and love.
0:26:16 But I think we’ll start filling in everything in the middle that’s not just like a character, a fictional character, not just a human thought leader, but like all of the real people in between.
0:26:21 I mean, I think people learn in different ways and AI voice products play really well to that.
0:26:31 Masterclass launched kind of an interesting beta where they take people who have already recorded courses on the platform and turn them into voice agents, where then you can ask questions that are really specific to you.
0:26:36 And from my understanding, it basically does RAG (retrieval-augmented generation) on everything they’ve said in the course.
0:26:39 And so returns a fairly customized and accurate result.
0:26:47 And that for me is interesting because I’m a fan of them as a company, but I’ve never had the attention span or the time to sit down and watch like a 12-hour Masterclass.
0:26:54 But I’ve had some really interesting conversations with the Masterclass voice agents where I can talk to them for two or three or five minutes.
0:27:00 And so I think that’s an example of where we’ll see real people turn into AI clones in ways that are useful.
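The course-agent behavior described above is retrieval-augmented generation. Here is a minimal sketch, assuming generic embed() and chat() callables that wrap whatever embedding and chat models you use; none of this reflects Masterclass’s actual system.

```python
import numpy as np

def answer_from_course(question, chunks, embed, chat, k=3):
    # Embed the question and every course-transcript chunk, then rank chunks
    # by cosine similarity to the question.
    chunk_vecs = np.array([embed(c) for c in chunks])
    q_vec = np.array(embed(question))
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec) + 1e-9
    )
    top_chunks = [chunks[i] for i in np.argsort(sims)[-k:]]

    # Ask the chat model to answer only from the retrieved excerpts.
    prompt = (
        "Answer in the instructor's voice, using only these course excerpts:\n\n"
        + "\n---\n".join(top_chunks)
        + f"\n\nQuestion: {question}"
    )
    return chat(prompt)
```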
0:27:06 Drawing on one of the things that we said earlier, which is that enterprises are pulling in these types of things even faster than consumers.
0:27:14 Like we actually talked to a company that from its inception recorded every single interaction of every single employee.
0:27:19 So when the employee’s gone, the ghost lives there and you can still get all the wisdom.
0:27:21 Terrifying.
0:27:26 Terrifying, but also I’d love to continue to get wisdom and then thoughts from the greats, you know.
0:27:28 It really echoes the idea that everyone’s replaceable.
0:27:29 And it’s interesting.
0:27:29 By yourself.
0:27:34 We only need you for five hours.
0:27:36 I thought that was fascinating, like crazy.
0:27:40 Like everyone’s sort of ghost version that lives on like Harry Potter.
0:27:44 It’s also like, do you want to talk to a synthetic version of a person that you find interesting?
0:27:51 Or is there an entirely synthetic person that doesn’t exist in the real world that is a perfect match for your interests?
0:27:53 And maybe that’s a more interesting question.
0:27:54 What does that person look like?
0:27:57 Because they might even exist in the world, but if you don’t meet them, you don’t meet them.
0:28:00 And now they can be sort of brought to life with this technology.
0:28:01 Yeah, it’s interesting to think about.
0:28:10 What are the sort of use cases for which we’re going to want to have a human or someone we think is a human sort of doing the activity versus where are we going to be more open to that?
0:28:15 Like I think Olivia’s point is with the master class thing, there’s already this parasocial relationship.
0:28:25 So there’s value in feeling like you’re talking to a specific instance of a person versus talking to the abstract, most interesting person you may ever meet where you don’t need to have that pre-wired.
0:28:27 Which may be ChatGPT.
0:28:30 Wasn’t there like a viral tweet that someone recorded in New York Subway?
0:28:34 Like this person was fully talking to ChatGPT as if they’re talking to a girlfriend?
0:28:35 Yeah, yeah, yeah.
0:28:45 And there was another one where this parent posted they had lived through 45 minutes of their son asking questions about Thomas the Tank Engine and they couldn’t do it anymore.
0:28:46 So they gave him the phone.
0:28:54 They put voice mode up and forgot about it and went to do something else and came back two hours later and the kid was still talking to ChatGPT about Thomas the Tank Engine.
0:28:55 That’s awesome.
0:28:59 In that case, like the kid has no idea who the character on the other end is.
0:29:02 They just know it’s a person who wants to go super deep on their interests.
0:29:06 Right, and there is no human that can talk about Thomas the Tank Engine for that many hours.
0:29:08 For two hours and 45 minutes straight.
0:29:08 That’s right.
0:29:22 I mean, if we go to ChatGPT or Claude right now for therapy or coaching, I’d prefer to go to a sort of AI clone of my therapist or coach, and maybe in the future we record our sessions so that it has the data, or the therapist or coach has like so much content online that we could just recreate them.
0:29:32 But yeah, and to your point of like five, ten years from now, will the top artists be sort of new versions of Lil Miquela, sort of AI-generated people, or will they be sort of Taylor Swift?
0:29:34 And her just army of AI.
0:29:35 Or a duet.
0:29:36 Yeah.
0:29:37 A little bit of both.
0:29:43 And similarly on Twitter, the social characters that we follow, the next Kim Kardashian, is that a real person or is that AI-generated?
0:29:44 Do you have a hypothesis on that?
0:29:49 I have been thinking about this a lot for a couple of years because I think we all followed Lil Miquela closely.
0:29:56 Then we followed some of the like K-pop bands that I think were the first to start introducing like AI, hologram-based type characters.
0:29:58 I thought you were going to say they were AI-generated.
0:29:59 I was like, that would make sense to me.
0:30:03 No, one of them went to the military, so they like replaced them with an AI.
0:30:04 Something like that.
0:30:21 And then I think we’ve also really closely followed, like this is sort of tied really closely into photorealistic image and video because we’re now seeing people create these like influencers who get a ton of attention and followers largely because they now look realistic enough that you don’t know if they’re AI or not.
0:30:23 And there’s a lot of debate around that.
0:30:28 My take is probably there will be fragmentation into two types of creators or celebrities.
0:30:34 One type is like a Taylor Swift type where like the human experience of it I think matters in some ways.
0:30:45 Like a lot of people not only love her song but resonate with the things that have happened to her in her life and her stories and her live performances and like all of those things that AI cannot yet replicate.
0:30:50 There’s another type of celebrity or creator who is more like interest-based.
0:30:57 What we were talking about with ChatGPT talking about like Thomas the Tank Engine, it sort of doesn’t matter if that person has lived the real human experience or not.
0:31:02 It just matters if like they can be interesting talking about or sharing content around a certain topic.
0:31:05 And so if I had to guess, we’ll still have both.
0:31:05 Yeah.
0:31:09 This kind of gets back to the great like AI art debate that always rages on.
0:31:09 Yeah.
0:31:18 Which is like, yes, anyone can generate art now easier than ever before, but it still takes an enormous amount of time to make great AI art.
0:31:21 We hosted an event with a bunch of AI artists last summer.
0:31:30 And many of these people, when they walked you through their workflow of making an AI movie, it actually probably takes just as much time as it would have to film that.
0:31:33 But maybe they didn’t have the skill set, so they’d never be able to do that before.
0:31:42 And so I think we’ve seen, yes, an explosion of like influencers that are AI, but still very few of them have risen to the top and become the Lil Miquelas.
0:31:43 There’s only been a couple.
0:31:48 So I think we’re going to see something similar happen where we’re going to have pools of AI talent and pools of human talent.
0:31:51 And the very best of each is going to rise to the top.
0:31:55 And it’s going to be a really low conversion rate on both, which is probably how it should be.
0:31:56 Or non-human talent.
0:31:56 Yeah.
0:32:02 Like I think what like AI unlocks, one of the interesting things we’ve seen in Veo 3 is like that street interview format.
0:32:10 But like the person being interviewed is like an elf or like a wizard or like a ghost or these furry blob characters that Gen Z loves talking to.
0:32:12 Like those could all be AI.
0:32:14 Like that sort of thing is very interesting.
0:32:16 I mean, I think we see this in music too.
0:32:21 I think with a lot of music, the problem is that the music the AI generates just feels very mid.
0:32:26 And definitionally, these things are averaging machines and culture is supposed to be at the edge.
0:32:30 So I think it’s more of a problem with bad art versus bad artists.
0:32:33 And we’re conflating those two things and saying it’s AI.
0:32:35 It’s not the AI that’s the problem.
0:32:36 It’s the bad art that’s the problem.
0:32:40 So if the art was at the same level, you don’t think that there’s necessarily any difference,
0:32:42 that people would just want to hear from humans?
0:32:43 Well, potentially.
0:32:47 And then I also think this is where we start to get a more philosophical debate,
0:32:52 which is if you train a model with all the music up until, but just prior to hip hop,
0:32:53 would it like infer hip hop?
0:32:58 I don’t think so because music is the intersection of past music and culture is critical to it.
0:33:04 So you sort of need something that is at the edge and outside of the training data to create new interesting music.
0:33:07 And that sort of definitionally doesn’t exist in the models.
0:33:07 Fascinating.
0:33:11 So some of my closest friends who are some of the most talented people I know are working
0:33:16 on a gay AI companion app, which the 2015 version of myself, upon hearing that statement, would have been like, what?
0:33:17 That’s a thing?
0:33:22 But one of the things they were saying is that on our list, 11 of the top 50 apps were companion apps.
0:33:25 So let’s reflect on, are we just at the beginning of that trend?
0:33:27 Is there going to be all these different vertical companion apps?
0:33:28 What is the future of this?
0:33:29 How do we think about that?
0:33:30 Everyone’s looking at me.
0:33:31 Yeah.
0:33:32 Well, you’ve done the most work here.
0:33:33 Yeah.
0:33:42 We’ve spent an enormous amount of time in every facet of companionship from the like therapy, coaching, friends, all the way to the like not safe for work, AI girlfriends.
0:33:43 Like we’ve looked at basically everything.
0:33:49 And interestingly, I think like it was probably the first mainstream use case of LLMs.
0:33:58 We like to joke that like literally any chat bot, whether it’s like your car dealer’s customer support or whatever, people try to turn into their therapist or their girlfriend.
0:34:05 Like you talk to these companies and you look at the logs of the chats and it’s like a ton of people just want someone or something to talk to.
0:34:16 And the fact that you can now have a computer be talking back in a way that’s like immediate, always available and feels human is just like a massive unlock for so many people who could never get that before.
0:34:19 Or felt like they were just yelling or talking into the void.
0:34:30 I would argue we’re just at the beginning, especially because the products that have existed were largely very horizontal and came from or were exclusively from the base model providers.
0:34:34 Like people were using ChatGPT for all of these things it wasn’t designed for.
0:34:46 We’ve already seen a bunch of cases where like an individual company can create a personality for a character and embody it in some like digital avatar and prompt it and create a game or a world around it.
0:34:51 That gets a ton of engagement, and there are companies like Tolan that are doing this for teenagers and college kids.
0:34:59 Whereas a totally different company, which I would also call a companion, is like allowing you to take a photo every time you eat something.
0:35:08 It pulls out and analyzes all of the data and then it gives you all this information about how you’re doing nutrition wise and allows you to talk to it and get emotional support.
0:35:30 And so I think what’s really exciting to us is like the definition of what a companion is has evolved so quickly from either a friend or a girlfriend to like anything, any sort of advice or wisdom or entertainment or counsel you could have gotten from a human before.
0:35:33 And we’re going to see even more vertical companions moving forward.
0:35:43 One thing I thought about is, having worked at a social company, there is a very clear trend of the average number of friends that you can talk to going down over time.
0:35:46 I think for the youngest generation it’s something above one.
0:35:52 So I think the need for companion as a use case will absolutely be there.
0:35:54 It’ll be an enduring use case.
0:35:58 It’ll be something critical for actually a lot of people.
0:36:01 So I think I’m very excited about the companion use case.
0:36:03 And as Justine said, I think it branches out into different things.
0:36:09 But the need for having a close connection to talk to will endure.
0:36:15 And perhaps we talked about how maybe connection is a missing area, white space, but maybe this is filling that in, right?
0:36:18 Like as we say, maybe you just need to feel connected to something.
0:36:19 It doesn’t need to be human.
0:36:22 And that average number of one will go down to zero.
0:36:24 That’s the sad part of it.
0:36:25 One AI friend.
0:36:33 Tell me a story about the senior citizen that they set up with the AI.
0:36:34 Oh, I love that story.
0:36:35 Oh, my gosh.
0:36:35 Yeah.
0:36:38 So this woman set up her dad.
0:36:40 He was having some memory issues.
0:36:41 Her mom had passed away.
0:36:43 Her dad, I think, went into like a care home.
0:36:46 And she posted on Reddit.
0:36:49 And this was like when AI companion didn’t exist as a term.
0:36:55 So she posted on some subreddit, I think it was like a not safe for work AI one being like, I don’t really want something not safe for work.
0:37:02 But I want like an AI girlfriend or an AI friend to talk to my dad and keep him company all day because like I can’t spend hours on the phone with him all day, every day.
0:37:15 And then when she actually reviewed what he was doing and what he wanted to talk about, he wanted to like mostly talk about World War II stories and random things, and like occasionally feel like someone was flirting with him and found him interesting.
0:37:28 And that sort of thing, it’s like, ChatGPT is not great at that, just the way it speaks, the voices, and like OpenAI does not want to build the AI girlfriend for seniors who mostly want to talk about World War II.
0:37:31 But that might end up being a massive market.
0:37:32 In a flirty way.
0:37:33 In a flirty way.
0:37:34 It’s interesting.
0:37:39 I tend to look at like Korea sometimes or Japan as like an indication of what the societal changes can be.
0:37:41 And there’s a bunch of elderly there.
0:37:41 Yeah.
0:37:43 And the ratio is completely off.
0:37:51 So there’s that need for beings that can talk to the elderly and keep them company about World War II and so on and so forth.
0:37:53 I think that’s actually a really interesting use case too.
0:37:54 It’s phenomenal.
0:37:59 I’m imagining just seeing you trying all these companions or getting immersed in it and Olivia being like, what are you doing?
0:38:01 And just being like, it’s for work.
0:38:05 What has been your impression of all of it?
0:38:09 I think it’s been really fascinating, the companion thing,
0:38:14 because there is a huge category of people who’ve been willing to type into a text box and treat it as a friend.
0:38:21 And then there’s probably a much bigger category of people who don’t want to think of themselves that way.
0:38:36 And so I think that we have yet to unlock the next modality of companions where maybe it’s like a voice in your ear or maybe it’s sitting on your computer screen where it’s not obvious that you’re turning to it for friendship, but it provides the same emotional value.
0:38:43 And I think that is what is just now emerging in companions as the models become more multimodal, essentially.
0:38:44 Always on companion.
0:38:44 Yes.
0:38:46 And that’s what’s really exciting.
0:38:47 Like some people would use an AI assistant.
0:38:48 Yes.
0:38:49 But not an AI friend.
0:38:50 Yes, exactly.
0:38:55 Even if the assistant is like a friend that’s also an agent that can send emails for you.
0:39:01 So a lot of people upon hearing this conversation of companions just think, oh, man, people are going to have less friends.
0:39:02 People aren’t going to date anymore.
0:39:04 And depression is going to go up.
0:39:05 Suicide is going to go up.
0:39:07 Fertility is going to continue to go down.
0:39:12 Mark and Jason once had this famous quote of, I’m not saying you’re going to be happy, but you’re going to be unhappy in new and exciting ways.
0:39:14 Are you kind of like, yep, that is what it is.
0:39:15 That’s technology.
0:39:16 Or are you like, no?
0:39:17 I don’t think so.
0:39:22 This reminds me of my favorite post of all time on the Character AI subreddit, which I’ve spent an immense amount of time on.
0:39:24 Which is, okay, and to set the scene.
0:39:33 So there’s all of these like high school or college kids who had their formative years during COVID and they weren’t really in person with other kids or teenagers or learning how to talk to people.
0:39:36 And I think it really ended up impacting a lot of them.
0:39:43 And one of those kids, I think he’s in college now, had been posting on the Character AI subreddit about his AI girlfriend for a while.
0:39:51 And then one day he posted that he found a 3DGF, so a real-life girlfriend, and that he wouldn’t be returning to the subreddit for a while.
0:39:57 And he actually credited Character for teaching him how to talk to other people, especially teaching him how to talk to girls.
0:40:02 Like how to flirt, how to ask people questions, how to engage with them about their interests.
0:40:06 And I think that like, in some ways, that’s sort of like the peak value of AI.
0:40:10 It’s like enabling better human connection.
0:40:11 Just less weird.
0:40:11 Yeah.
0:40:13 Were people happy for him or did they call him a traitor?
0:40:14 People were extremely happy.
0:40:21 I mean, there were a few, I think, jealous souls in there who had not found their 3DGF yet, but I have hope for them.
0:40:31 I think that’s real, though, because we’ve even seen studies like, I think of the Replika product, where actual studies were showing depression and anxiety and kind of suicidal ideation were going down in users.
0:40:37 I do think there’s this trend of a lot of people don’t feel understood and don’t feel safe.
0:40:41 And so then it’s hard for them to be in the real world doing real things.
0:40:46 And so if AI can help them, and maybe they don’t have the money or the time to go to therapy and make all of these changes in their lives.
0:40:54 And so if AI can do that for them, they can emerge a transformed person that’s then more able to do things in the 3D world.
0:41:05 The thing that really got me sort of aware of how big these companion apps are was when we did the first interview with the founder of Replika. It was amazing.
0:41:15 After she turned off the intimate stuff, the subreddit for Replika and the comments on our video were basically a lot of people being like, hey, this is like my wife when we stopped having sex.
0:41:18 You know, like I already have this sort of neutered relationship.
0:41:23 And like so many people were just like, my life is over.
0:41:27 And I’m like, oh, my God, I didn’t realize how big of a role this app was playing in people’s lives.
0:41:31 I feel like that is bringing out an activity that people have done for a long time.
0:41:39 Like people have had these like Internet chat room, Discord relationships, like the youth, the Zoomers have like Discord girlfriends and boyfriends.
0:41:53 In our day, there was like this anonymous postcard website where you would go and send anonymous postcards back and forth and develop these like really deep relationships with like people you would never meet or you didn’t know if they were the person they were pretending to be.
0:41:57 And I think AI just makes that a deeper, more engaging experience.
0:42:04 Well, so this is where I think an important point, though, is that the AI not be too agreeable, because in real life, I mean, there’s a give and a take to human relationships.
0:42:08 And like highly agreeable AI does not set you up well for that.
0:42:17 So I think there’s a fine balance between being just agreeable enough to help you like engage and get better at this versus being so agreeable that you’re actually worse at this.
0:42:19 And that’s also important on the therapist’s use case.
0:42:19 Yeah.
0:42:23 Therapists can’t just say you’re absolutely right on everything.
0:42:23 Totally.
0:42:26 They have to be able to say, actually, let’s review your behavior.
0:42:30 That weekend when GPT-4o was like telling everyone that they were like the king of the world.
0:42:35 It turns out everyone hated it because you don’t believe it when it just tells you that you’re amazing all the time.
0:42:40 When it comes to like the therapist use case and the actual real world, the disagreeableness matters a lot.
0:42:43 I want to close with what’s possible going forward.
0:42:48 Let’s speculate on new platforms or form factors that could be game changing.
0:42:50 OpenAI just acquired Jony Ive’s company.
0:42:54 You know, Brian, I’ve heard you talk a bit about glasses and why you’re still excited about that form factor.
0:43:02 Maybe we could start there, but I want to hear from the group on what they could imagine as something that’s additive or even disrupting some of the mobile use cases.
0:43:08 It’s funny, thinking about glasses and all that, but there are 7 billion mobile phones out there.
0:43:13 There aren’t that many devices at all that actually get to that level.
0:43:17 So my thought process is either it will live in mobile.
0:43:30 And for that, there are many different ways to think about the future, where there’s a privacy wall around it, or a local LLM or local model that helps you sort of really contain all the things that you want to contain at the device level.
0:43:36 So I think I’m still very much excited about the model development layer to get to that.
0:43:39 And I think that’s what I’m actually most excited about.
0:43:44 And then if you think about always on, as Olivia, you said, like mobile, we have always on.
0:43:47 But there are other things we also have always on.
0:43:53 And what does that look like when there are net new devices or what have you or appendages, if you will,
0:43:57 that like actually attach to things that you always have that actually can enable that?
0:43:59 Any speculation from you guys?
0:44:07 Is there a piece of hardware or something that we’re going to be wearing or carrying around or using that’s either attached to the phone or separate from the phone that could enable the use cases?
0:44:15 I think AI has scaled for consumers tremendously well, given it’s mostly been text box in, some output in a web browser out.
0:44:19 And so I love the idea of AI kind of actually being with you and seeing what you see.
0:44:28 It’s funny now when I go to tech parties, like a lot of the under 20s are wearing pins that record what they’re saying and doing and they find like real value from them.
0:44:29 That’s one example.
0:44:40 We’ve seen a new wave of products that can see what’s happening on your screen and take action for you, help you, coach you, other things like that, that I also find really, really exciting.
0:44:52 And I think as also the agentic models get even better, it goes beyond just like suggestions to actually doing work for you, sending emails for you, which is very exciting for me, I think.
0:44:55 I think, yeah, the human insight layer of that is big too.
0:45:00 Like often we have no way of measuring ourselves compared to other people or sort of where we exist in the world.
0:45:10 So if an AI can hear all of your conversations and see everything you’re doing online and say, hey, look, like if you spent five more hours a week doing this, you would actually be a world expert in this topic.
0:45:15 And based on this vast network of other people I’m serving, like you should connect with these three other people.
0:45:17 And this person could be an amazing co-founder.
0:45:20 You should like date this person, like that sort of thing.
0:45:21 That to me is the ultimate.
0:45:22 Yes.
0:45:23 Like sci-fi vision.
0:45:27 Which comes from AI being with you all the time.
0:45:27 Right.
0:45:29 And something that’s not just like a chat GPT text box.
0:45:30 Totally.
0:45:35 The device that has been most widely adopted post phone is the AirPods.
0:45:38 So that feels like the thing that’s hiding in plain sight.
0:45:43 And there’s a whole bunch of like social protocol questions around it because it’s weird to have your AirPods in at dinner.
0:45:44 No one does that.
0:45:44 Right.
0:45:50 But there may be a way that you can integrate AI and also fit the current social protocols around AirPods.
0:45:51 It would be interesting.
0:45:51 Yeah.
0:45:55 You said something that we glossed over, but young people at parties are recording their conversations.
0:45:56 Yes.
0:45:58 In the future, is everything going to be recorded?
0:46:00 You think that generation is already growing up with that norm?
0:46:01 To some degree?
0:46:01 Yeah.
0:46:06 I think there’ll be new social norms developed around this behavior because I think it’s like
0:46:07 real and it’s valuable.
0:46:10 And so it’s like scary, I think, for a lot of people that this is happening.
0:46:13 But I think it’s a wave that started and it’s not going to stop.
0:46:15 And I think the context matters too.
0:46:19 Like I think a lot of what you’re talking about is like the SF networking parties where like
0:46:21 work and personal stuff like really blurs.
0:46:22 We talked about this.
0:46:23 You can do that in SF.
0:46:26 You do that at a party in New York...
0:46:27 Canceled.
0:46:27 Yeah.
0:46:31 But I think that’s why there’ll be like a new set of cultural norms.
0:46:34 Like when the cell phone was introduced, like there’s places where it’s rude to take a loud
0:46:34 call.
0:46:38 Like the same set of things will emerge around these recording devices.
0:46:39 Yeah.
0:46:41 Let’s end on this idea that we’re very early.
0:46:42 Guys, it’s been a great conversation.
0:46:43 Thanks so much for coming on.
0:46:49 Thanks for listening to the A16Z podcast.
0:46:54 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash
0:46:55 A16Z.
0:46:57 We’ve got more great conversations coming your way.
0:46:58 See you next time.
In this episode, a16z General Partner Erik Torenberg is joined by the a16z Consumer team—General Partner Anish Acharya and Partners Olivia Moore, Justine Moore, and Bryan Kim—for a conversation on the current state (and future) of consumer tech.
They unpack why it feels like breakout consumer apps have slowed down, how AI is changing the game, and what might define the next era of products. Topics include:
- The rise of AI-native consumer tools and companion apps
- Why users are now spending $200+/month on AI products
- The missing AI-powered social graph
- Why speed and iteration may matter more than traditional moats
- And what it means to build for a world where software touches everything
From shifting business models to new behavior patterns, this is your pulse check on where we are—and where consumer is heading next.
Timecodes:
00:00:00 – Introduction to Consumer AI
00:01:00 – The Evolution of Consumer Breakouts
00:03:18 – The Shift in Consumer Spending
00:08:00 – The Future of Social Networks with AI
00:13:00 – Enterprise Adoption of AI
00:20:42 – The Rise of Voice Technology
00:23:06 – AI’s Role in Enterprise Conversations
00:25:25 – AI in Education and Personal Development
00:26:34 – AI Companions: The New Norm
00:31:52 – The Future of AI Companions
00:38:50 – Speculating on New AI Platforms
00:42:07 – The Social Norms of AI Integration
Resources:
Find Anish on X: https://x.com/illscience
Find Olivia on X: https://x.com/omooretweets
Find Justine on X: https://x.com/venturetwins
Find Bryan on X: https://x.com/kirbyman01
Stay Updated:
Let us know what you think: https://ratethispodcast.com/a16z
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://x.com/eriktorenberg
Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.