AI transcript
0:00:06 and Shopify.
0:00:11 Teachable makes it easy for creators to monetize their content with full control.
0:00:15 Head to teachable.com and use code “PROFITING” to claim your free month on their Pro paid
0:00:16 plan.
0:00:20 Grow your real estate investments in minutes with the Fundrise flagship fund.
0:00:26 Add the Fundrise flagship fund to your portfolio with as little as $10 at fundrise.com/profiting.
0:00:29 Save big on wireless with Mint Mobile.
0:00:35 Get your new three-month premium wireless plan for just $15 a month at mintmobile.com/profiting.
0:00:39 Unlock your team’s potential and boost productivity with Working Genius.
0:00:45 Get 20% off the $25 Working Genius Assessment at workinggenius.com with code “PROFITING”
0:00:46 at checkout.
0:00:49 Attract, interview, and hire all in one place with Indeed.
0:00:53 Get a $75 sponsored job credit at indeed.com/profiting.
0:00:55 Terms and conditions apply.
0:00:59 Shopify is the global commerce platform that helps you grow your business.
0:01:04 Sign up for a $1 per month trial period at Shopify.com/profiting.
0:01:09 As always, you can find all of our incredible deals in the show notes.
0:01:11 Routine work is going to go away.
0:01:15 That’s going to free us up to be creative as entrepreneurs.
0:01:22 I’m expecting people to declare that it’s like them and their AI starting this new company.
0:01:26 You don’t have to learn the language of a computer to interact with it anymore.
0:01:34 We’re just about to embark on a completely new phase of the digital journey where computers
0:01:36 are now learning to speak our language.
0:01:40 It will increasingly be able to see what you see.
0:01:45 It now lowers the barrier to entry to get access to support and information.
0:01:46 And that is what is going to be the level up.
0:02:07 Yap fam, welcome back to the show.
0:02:12 And today we have an incredibly insightful conversation about AI.
0:02:15 AI is completely transforming the world.
0:02:19 You’re probably using AI every single day.
0:02:23 But today we’re going to learn even more about AI than we’ve ever heard before because
0:02:28 I’ve brought on one of the pioneers of AI, Mustafa Suleyman.
0:02:33 Mustafa Suleyman is the co-founder of DeepMind and the co-founder of Inflection AI.
0:02:38 He formerly worked at Google, and now he’s the CEO of Microsoft AI.
0:02:42 We’re going to learn about all the new developments with Microsoft’s Copilot product.
0:02:46 And Mustafa is going to absolutely blow your mind.
0:02:50 We’re going to learn so much more about AI than we ever have on the show, and we talk
0:02:56 about the difference between AGI, artificial general intelligence, and the narrow AI that
0:02:57 a lot of us are familiar with.
0:03:03 We’re going to learn how AI is becoming more human-like than ever, and how AI is going
0:03:08 to truly become our co-pilot, helping us in work and life and even potentially one day
0:03:11 being our co-founders of our businesses.
0:03:15 Mustafa also is going to cover the containment challenge related to AI, and we’re going
0:03:21 to learn how we need to make sure we put up guardrails so that AI works for us and not
0:03:23 against us as a human society.
0:03:25 I’m so excited for this conversation.
0:03:27 I really think you guys are going to love it.
0:03:28 I think you guys are going to learn so much.
0:03:33 Even if you’re a pro at AI, I guarantee you Mustafa’s future predictions are going to
0:03:35 really open your mind.
0:03:39 So without further ado, here’s my conversation with Mustafa Suleyman.
0:03:43 Mustafa, welcome to Young and Profiting Podcast.
0:03:44 It is great to be here.
0:03:45 Thanks so much for having me.
0:03:47 I’m so excited for this interview.
0:03:53 My listeners love to learn about AI, and you have been a pioneer in AI.
0:03:55 You co-founded DeepMind.
0:03:57 You co-founded Inflection AI.
0:04:03 You’re now leading Microsoft AI, and I read your book, The Coming Wave, and one of the
0:04:06 things that you say is that AI has a lot of threats.
0:04:10 You even say it’s a threat to the global world order.
0:04:12 So my first question to you is this:
0:04:15 Are you an optimist or a pessimist when it comes to AI?
0:04:20 I think that to be rational, you have to be both.
0:04:27 And if you just sit on one side or the other, you’re probably missing an important part
0:04:28 of the truth.
0:04:34 Because I think wisdom in the 21st century is being able to hold multiple confusing or
0:04:40 contradictory ideas in working memory at the same time and navigate a path through those
0:04:46 things which doesn’t leave you sort of characterizing those who disagree with you or somehow see
0:04:51 things slightly differently in a negative or unconstructive way.
0:05:00 And so it is true that this AI moment is going to deliver the greatest boost to productivity
0:05:03 in the history of our species in the next couple of decades.
0:05:05 That to me is unquestionable.
0:05:07 I don’t think that makes me an optimist.
0:05:11 I think that makes me a good predictor of the underlying trends.
0:05:18 At the same time, it is also going to create the most change in a disruptive, positive and
0:05:22 negative version of disruptive way we’ve ever seen.
0:05:26 And that is going to be incredibly destabilizing to the way that we currently understand the
0:05:27 world to be.
0:05:32 The way that we work, the way our politics operates, fundamentally what it even means
0:05:33 to be human.
0:05:36 So we’re about to go through a revolution.
0:05:43 And I think that in that world, I’m both optimistic and there are parts of me that do feel pessimistic.
0:05:47 So one of the things that you talk about related to powerful technologies is a containment
0:05:48 problem.
0:05:50 And you say it’s going to be one of the biggest challenges that we have.
0:05:55 So what is this containment problem and what are the challenges with containing technologies
0:05:56 like AI?
0:06:04 Yeah, so in the past, the goal has been to invent things and have science make our lives
0:06:06 calmer and happier and healthier.
0:06:11 And the race has been, how do we unlock these capabilities as fast as possible?
0:06:16 We want to create and produce and invent and solve challenges.
0:06:22 And that’s what we did with steam and oil and electricity and wind and food systems.
0:06:26 And we’ve seen this explosion of creativity, certainly in the last couple of centuries,
0:06:29 but that’s also been the history of our species.
0:06:36 So the goal there has been proliferate, spread it as far and wide as possible so that everyone
0:06:39 can enjoy the benefits immediately.
0:06:45 Now the thing that I’ve speculated about, which is the containment hypothesis, is that
0:06:51 if we do that this time with AI technologies in a completely unfiltered way, then that
0:06:58 has the potential to empower everybody to have a massive impact on everybody else in
0:07:00 real time.
0:07:04 So we aren’t just talking about spreading information and knowledge, that’s one part
0:07:10 of it, but increasingly your AI agents or your co-pilots will be able to actually do
0:07:13 things in the real world and in the digital world.
0:07:18 And the cost of building one of those things is going to be zero marginal cost, right?
0:07:21 So everyone’s going to have access to them in 20 years.
0:07:23 Maybe even much earlier.
0:07:27 And that’s going to completely change how we get things done, how we interact with
0:07:31 one another and potentially cause a huge amount of chaos.
0:07:36 And so a little bit of friction in the system could be our friend here.
0:07:40 That’s really all that I’m positing in the containment hypothesis.
0:07:45 Containing things so that we can collectively as a society think carefully about the potential
0:07:50 future consequences and the third order effects, that seems like a rational thing to do at
0:07:51 this point.
0:07:53 That makes a lot of sense.
0:07:59 So what are your thoughts around AI and the problems surrounding surveillance and bias
0:08:00 and things like that?
0:08:05 There are lots of potential ways that this technology gets misused.
0:08:09 And in some respects, this concentrates power.
0:08:15 It makes it easier for a small group of people to see what is happening in an entire ecosystem,
0:08:22 i.e. understand the faces and the images and the patterns of massive crowds of people
0:08:24 walking around in our societies.
0:08:26 And that certainly can be used for a lot of good, right?
0:08:32 I mean, we want to have a police force that is capable of instilling order and structure
0:08:33 and so on.
0:08:41 On the flip side, that also makes it easier for authoritarians, despots of any kind to
0:08:45 identify minority groups that they want to get rid of.
0:08:50 And it’ll just be much easier to find the needle in the haystack.
0:08:56 And again, that has amazing benefits because you can catch the bad guy, but it also is
0:08:57 potentially scary.
0:09:03 And so each one of these steps forward has that balancing act between the harms and the
0:09:04 benefits.
0:09:09 So let’s talk about the benefits, because I’ve been asking you a lot about the negative.
0:09:14 So how can AI help us solve things like world hunger and poverty and things like that?
0:09:20 Well, one way of thinking about it is that intelligence is the thing that has made us
0:09:23 productive and successful as a species.
0:09:29 It’s not just our muscle or our brawn, it’s actually our minds and our ability to make
0:09:35 predictions in complex environments to solve hard problems.
0:09:39 This is really the essence of what makes us special and creative.
0:09:42 We learn to use tools and we invent things.
0:09:48 So that intelligence, that technique of being able to predict what’s going to happen next,
0:09:55 that is actually something that we’re increasingly learning to automate and turn into an intelligence
0:09:59 system that can be used by everybody.
0:10:00 So what does that mean?
0:10:07 That means, okay, well, we now have this very thing that made us smart and productive and
0:10:09 created civilization.
0:10:15 That very concept, intelligence, is going to be cheap and abundant.
0:10:22 Just like energy, frankly, oil has turbocharged the creation of our species and we now have
0:10:28 seven billion people on the planet, largely as a result of the proliferation of oil.
0:10:36 The next wave is that everybody is going to get access to personalized real-time knowledge
0:10:44 and a companion, an aide, a coach, a guide, a co-pilot that is going to help you get things
0:10:45 done in practice.
0:10:49 It will help you create and invent and solve problems and get things done.
0:10:52 And so that is a massive force amplifier.
0:10:58 So for everybody who wants to solve world hunger or invent new energy systems or solve
0:11:04 the battery challenge so that we can really unlock the power of renewables, we now are
0:11:09 going to have a super intelligent aide at your fingertips that’s going to help you work
0:11:10 through those problems.
0:11:15 It’s so cool to think about and as you’re talking, I think about the iPhone, right?
0:11:17 We had that in our pocket.
0:11:22 It’s not personalized AI, it’s not an AI companion or anything like that, but it gives us access
0:11:24 to knowledge of the world.
0:11:29 It really has moved us forward as humans and now there’s like another wave coming and it’s
0:11:30 pretty exciting.
0:11:35 I feel like we’re already ready for it because we’ve had stuff like the iPhone and the Internet,
0:11:36 would you say?
0:11:37 Yeah, you’re totally right.
0:11:43 And in many ways, the iPhone’s impact was just as hard for people to predict.
0:11:50 If you said to people in 1985, what would the internet enable?
0:11:56 Very few people would have predicted mobile phones with real-time communication, video
0:12:04 streaming, a camera and a microphone in your pocket on the table, almost in every room.
0:12:06 That would have sounded terribly dystopian and scary.
0:12:11 And yet because of the friction, because it’s taken us a couple of decades to really figure
0:12:17 it out and get it working in practice, we’ve created boundaries around these things.
0:12:22 We’ve got security, we have encryption, there are privacy standards, there are real regulators
0:12:28 that make meaningful interventions and there’s public pressure and there’s a reaction and
0:12:34 it steadies the kind of arrival of these technologies in a really healthy and productive way.
0:12:35 And it’s kind of mind-blowing.
0:12:36 You’re right.
0:12:37 Who would have thought what the iPhone would have done?
0:12:38 Yeah.
0:12:40 Well, I’m super excited about everything.
0:12:42 I do want to touch on your background a bit.
0:12:46 So you co-founded DeepMind and something that we have in common is that we’re both really
0:12:48 passionate about human rights.
0:12:51 I was telling you offline, I’m Palestinian, 100%.
0:12:56 So top of mind for me, especially this year, you’re Syrian and I know that there’s just
0:13:00 global issues going on in the world that makes us top of mind.
0:13:04 And you were a human rights activist long ago, that’s how you started your career.
0:13:10 So talk to me about how your experience with human rights and activism really helped shape
0:13:16 your vision and some of your ethical decisions related to DeepMind.
0:13:18 Thanks for the question.
0:13:25 Human rights is core to what I believe to be the solution for a peaceful and stable society.
0:13:31 Your success and my success is probably largely a function of the privilege that we have to
0:13:38 live in a society not at war, have access to a great education system, be able to get
0:13:44 access to healthcare, and really just get the chance to be who we want to be instead
0:13:49 of growing up in societies that are maybe in the middle of a refugee camp, going through
0:13:56 war, in the middle of complete chaos, families that get moved on month after month or year
0:13:59 after year and essentially end up being refugees.
0:14:06 And I think that more than anything, we have to have so much more empathy and kindness for
0:14:08 people who have been through that struggle.
0:14:14 There tends to be a sort of demonizing of refugees or migrants as though they’re like
0:14:17 coming to steal something from our stable worlds.
0:14:20 But in fact, they’re actually fleeing insane hardship.
0:14:25 So we have to be way more compassionate and forgiving and kind to those kinds of people.
0:14:29 And remember that those people are just like us.
0:14:35 Our societies could fall apart in the US, in the UK, in Europe, just as their societies
0:14:39 happen to be falling apart at this moment in time.
0:14:42 And we have to extend a hand of friendship and love.
0:14:47 And the human rights framework taught me that because I grew up as a pretty strict Muslim
0:14:55 and I realized, sort of in my late teens and early twenties, that it was too narrow a kind
0:15:00 of worldview, prioritized being Muslim over being human.
0:15:03 And it didn’t make sense to me that there wasn’t gender equality.
0:15:09 It didn’t make sense to me that people who chose to get with a member of the same sex
0:15:14 suddenly got demonized and they were like evil somehow, that none of that made sense
0:15:15 to me.
0:15:19 Whereas a human rights framework respects everybody as equals.
0:15:22 And I just can’t see how that isn’t the right way to live.
0:15:28 So yeah, I became an atheist and secular and a big believer in these kinds of rights frameworks
0:15:29 for everyone.
0:15:34 And then how did that shape the way that you thought of AI and the way that you decided
0:15:37 to develop your technology at DeepMind?
0:15:41 Well, think about AI as a force amplifier.
0:15:49 So the question is, which force is it amplifying and which frame is it placing on the amplification?
0:15:54 So what set of values, what ideas is it putting out into the world?
0:15:55 What are its boundaries?
0:15:56 What does it not do?
0:16:01 And so technology is fundamentally an ethical question.
0:16:08 It is clearly about how we reframe our culture and our ideas and our entertainment, our music,
0:16:10 our knowledge in the world.
0:16:14 So it was very obvious to me from the very beginning that we started DeepMind that we
0:16:20 were going to have a huge moral responsibility to think about what it’s going to be like
0:16:25 to bring these agents, these co-pilots into the world.
0:16:26 What would their values be?
0:16:28 What would they not do?
0:16:29 What are their limitations?
0:16:35 That always was a big motivator for me and was a big part of the kind of structure that
0:16:40 led to our acquisition by Google and subsequent efforts over my time at Google.
0:16:43 And since then, I’ve always been very focused on that question.
0:16:47 So when it comes to your technology goals at DeepMind, you have the goal of creating
0:16:50 AGI, artificial general intelligence.
0:16:53 We’ve never talked about that on the podcast.
0:16:56 Probably 90% of my listeners have no idea what AGI is.
0:16:57 Can you explain what that is?
0:16:59 Yeah, great question.
0:17:04 This is a term that gets bandied around quite a lot in nerd niche circles and it’s sort
0:17:12 of broken out a little bit now, but basically the idea of a general purpose intelligence
0:17:19 is that it’s capable of learning in any environment and over time it will end up becoming a super
0:17:20 intelligence.
0:17:24 It will get better by learning from itself and learning from any environment that you
0:17:33 put it in, so much so that it exceeds human performance at any knowledge or action task.
0:17:40 So it could play any game, control any physical infrastructure, learn about any academic discipline.
0:17:43 It’s going to be a really, really powerful system in the future, but I do think that’s
0:17:45 a long, long way away.
0:17:51 Before we get there, there are just going to be regular AI systems or AI companions that
0:17:55 can talk to you in the same language that you or I talk to one another.
0:18:00 I’m sure lots of people have played with various chatbots like ChatGPT and Copilot, and you
0:18:02 can see it’s getting pretty good.
0:18:06 It’s still a little awkward, but it’s pretty knowledgeable.
0:18:10 We’ve been reducing the hallucinations quite a lot.
0:18:14 Lately we’ve added the real-time access to information so it can check up on the news
0:18:20 and it’ll know the weather and it’ll have a temporal awareness, and those systems I
0:18:25 think are going to be around for a pretty long time before we have to worry about a
0:18:28 big AGI, but that is probably coming.
0:18:35 So when you say hallucinations, are you talking about incorrect information coming from AI?
0:18:36 Not quite.
0:18:42 I think of a hallucination as the model inventing something new.
0:18:46 And so sometimes you want it to be creative.
0:18:53 We’re actually looking for it to find the connection between a zebra and a lemon and
0:18:55 a New York taxi.
0:18:59 And if you ask it that question, it’s going to come up with some creative poetic connection
0:19:01 that links those three things.
0:19:06 But then sometimes you just want it to be super to the point and not wander off and
0:19:10 talk all kinds of creative poetic nonsense.
0:19:12 You just want it to give you the facts.
0:19:14 So that’s a spectrum.
0:19:18 And what we’re trying to do is figure out from a query, like given the thing that you’re
0:19:25 asking the model, should we put it into a more creative mode or a more precise and specific
0:19:27 mode that’s more likely to be accurate?
0:19:30 And that’s kind of the hallucination challenge.
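(To make the creative-versus-precise idea above concrete, here is a minimal, hypothetical sketch in Python of routing a query to different decoding settings. The keyword list, temperature values, and mode names are invented purely for illustration and are not how Copilot or any shipping product actually decides.)

# Hypothetical sketch: classify a query, then pick decoding settings accordingly.
# Everything here (hints, temperatures, mode names) is made up for illustration.
CREATIVE_HINTS = ("poem", "story", "imagine", "brainstorm", "connection between")

def route_query(query: str) -> dict:
    """Return decoding settings based on whether the query seems to want creativity."""
    wants_creativity = any(hint in query.lower() for hint in CREATIVE_HINTS)
    if wants_creativity:
        # Higher temperature tends to give more varied, "poetic" completions.
        return {"mode": "creative", "temperature": 0.9, "top_p": 0.95}
    # Lower temperature tends to give more deterministic, fact-focused completions.
    return {"mode": "precise", "temperature": 0.2, "top_p": 0.5}

print(route_query("Find the connection between a zebra, a lemon and a New York taxi"))
print(route_query("What is the capital of Australia?"))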
0:19:32 How is AGI different from narrow AI?
0:19:34 What are the main differences?
0:19:38 Narrow AI is the conversational companions that I was describing.
0:19:43 It’s limited, it doesn’t improve on its own.
0:19:49 So you sort of train it within some boundaries, and then it gets good at a specific number
0:19:55 of tasks, like it’s good at great conversation, maybe it can do document understanding, maybe
0:19:57 it can even generate a little bit of code.
0:20:01 But we know what its capabilities are to some extent.
0:20:06 That’s a narrow version of AI when we use it for a specific purpose.
0:20:12 A more general AI is going to be one where it has things like recursive self-improvement.
0:20:18 It could edit its own code in order to get better, or it could self-improve, or it would
0:20:20 have autonomy.
0:20:25 It could act independently of your direct command, essentially, or you give it a very general
0:20:30 command and it goes off and does all sorts of sub-actions that are super complicated,
0:20:35 like maybe even invent a new product and create a website for it and then set up drop-shipping
0:20:40 for it and then go and market it and take all the income and then do the accounts and so
0:20:41 on.
0:20:46 I mean, I think that’s plausible in, say, three to five years.
0:20:50 Before 2030, I think we’ll definitely have that, and it might well be much, much sooner.
0:20:51 It could well be a lot sooner.
0:20:54 Oh my gosh, that’s so crazy to think about.
0:20:59 So how does that challenge the way that we think of us as humans and consciousness and
0:21:00 intelligence?
0:21:03 Is it going to change the way that we think of what’s human and what’s not?
0:21:04 Yeah, for sure.
0:21:08 I mean, we are going to have to contend with a new type of software.
0:21:13 Historically, software has been input in, input out.
0:21:18 You type something on Airbnb and it gives you a search result page.
0:21:23 You play a piece of music on Spotify and that comes out as intended.
0:21:27 And software so far has been about utilities, been functional.
0:21:32 The goal is for it to do the same thing over and over again in a very predictable way and
0:21:33 it’s been really useful.
0:21:38 We’ve created trillions of dollars of business value out of it and unbelievable social human
0:21:42 connection and knowledge and all the rest of it has been incredible actually.
0:21:51 But we’re just about to embark on a completely new phase of the digital journey where computers
0:21:55 are now learning to speak our language.
0:21:59 You don’t have to learn the language of a computer to interact with it anymore.
0:22:00 I mean, you can.
0:22:05 It’s still important to be a programmer, but it can actually understand your language.
0:22:11 It can understand the audio that you give it when you do a voice input, the intonation,
0:22:16 the inflection, the volume, the pace, the pauses.
0:22:19 It will increasingly be able to see what you see.
0:22:24 So not only will you take a picture and it will recognize what’s in the picture, it will
0:22:28 have complete screen understanding of everything that you’re doing in your browser or on your
0:22:30 desktop or on your phone.
0:22:31 Frame by frame.
0:22:35 You’re browsing Instagram, it’s seeing everything that you’re seeing in real time talking to
0:22:40 you about the content of what you’re interacting with on Instagram, on TikTok, when you’re
0:22:42 reading the news, whatever you’re doing.
0:22:45 So that’s a profound shift.
0:22:47 That’s not a tool anymore, right?
0:22:52 That is really starting to capture something meaningful about what it means to be human
0:22:57 because it’s using the same language that we use to understand one another.
0:23:03 All those subtle cues that take place in social bonding of human relationships, suddenly it’s
0:23:05 going to be present in that dynamic.
0:23:08 And it’s a really big deal.
0:23:13 And I think even though we’ve had a couple of years of large language models being out
0:23:16 there and people get to play with it in the open source, I still don’t think people are
0:23:21 quite grasping how big a deal this shift is about to be.
0:23:25 Let’s hold that thought and take a quick break with our sponsors.
0:23:29 Young and Profiters, buy low, sell high.
0:23:31 It’s easy to say, but it’s hard to do.
0:23:35 For example, high interest rates are crushing the real estate market right now.
0:23:39 Demand is dropping and prices are falling, even for many of the best assets.
0:23:43 It’s no wonder the Fundrise flagship fund plans to go on a buying spree, expanding its
0:23:47 billion dollar real estate portfolio over the next few months.
0:23:51 You can add the Fundrise flagship fund to your portfolio in just minutes with as little
0:24:00 as $10 by visiting fundrise.com/profiting, that’s F-U-N-D-R-I-S-E.com/profiting.
0:24:06 Again, you can diversify your portfolio with the Fundrise flagship fund at fundrise.com/profiting,
0:24:11 that’s F-U-N-D-R-I-S-E.com/profiting.
0:24:16 Carefully consider the investment objectives, risks, charges, and expenses of the Fundrise
0:24:18 flagship fund before investing.
0:24:23 This and other information can be found in the Fund’s prospectus at fundrise.com/flagship.
0:24:26 This is a paid advertisement.
0:24:29 Young and Profiters, chances are, if you’re listening to this show, you’ve got an expertise
0:24:31 that you can teach other people.
0:24:36 Chances are you can make passive income by creating your first course.
0:24:40 If you’ve been on the fence about creating a course, what are you waiting for?
0:24:45 It is so easy to launch a course these days because Teachable has got your back.
0:24:47 Teachable is the number one course platform.
0:24:52 It is simply the best platform for content creators to start or grow an online business
0:24:53 authentically.
0:24:57 That’s because Teachable offers more product options to create and sell than any other
0:25:03 platform with online courses, digital downloads, coaching services, memberships, and communities.
0:25:07 You can engage your online audience on a deeper level and you can get selling fast with easy
0:25:12 to use content and website builders plus a variety of AI tools.
0:25:17 They have an AI curriculum generator, they have automatic subtitles and translations,
0:25:22 so they’ve really figured out how to use AI to optimize our course creation and make
0:25:23 it much faster.
0:25:27 They also have a top rated mobile app that allows your customers to enjoy your products
0:25:28 on the go.
0:25:33 On Teachable, creators also get more power and ownership over monetizing their content.
0:25:37 Your content shouldn’t get lost in the algorithm and you shouldn’t have to rely on other companies
0:25:38 to pay you.
0:25:42 Now is the perfect time to join Teachable because you get an exclusive deal with our
0:25:44 partner code, Profiting.
0:25:49 Go to teachable.com and use code profiting to unlock a free month on their pro plan.
0:25:53 You’ll get all their marketing, product creation features to build out your offerings and there’s
0:25:56 no limitations on this trial, so you definitely don’t want to miss this.
0:25:58 Again, it’s teachable.com.
0:26:02 You can use code profiting for an entire free month on Teachable.
0:26:06 Be a part of the 100,000 plus creators who are already using Teachable to turn their
0:26:09 expertise into a thriving online business.
0:26:15 Yap gang, I appreciate a good deal just like anyone else, but I’m not going to cross a
0:26:19 desert or walk through a bed of hot coals just to save a few bucks.
0:26:23 It needs to be straightforward, no complications, and no nonsense.
0:26:27 So when Mint Mobile said that I could get wireless service for just 15 bucks a month
0:26:30 with a three month plan, I was skeptical.
0:26:35 But it is truly that simple to secure wireless at 15 bucks a month and Mint Mobile made my
0:26:37 transition incredibly easy.
0:26:39 The entire process was online.
0:26:43 It was easy to purchase, easy to activate, and easy to save money.
0:26:47 The only lengthy part that took up any time was waiting on hold to cancel my previous
0:26:48 provider.
0:26:52 If you want to get started, just go to mintmobile.com/profiting.
0:26:56 You’ll see right now that all the three month plans are only 15 bucks a month, including
0:26:58 the unlimited plan.
0:27:02 All plans come with high speed data and unlimited talk and text, delivered on the nation’s
0:27:04 largest 5G network.
0:27:08 And don’t worry, you can use your own phone and keep your current phone number with any
0:27:10 Mint Mobile plan.
0:27:14 Find out how easy it is to switch to Mint Mobile and get three months of premium wireless
0:27:17 for just 15 bucks a month.
0:27:21 To get this new customer offer and your new three month premium wireless plan for just
0:27:25 15 bucks a month, go to mintmobile.com/profiting.
0:27:27 That’s mintmobile.com/profiting.
0:27:31 Cut your wireless bill to just 15 bucks a month at mintmobile.com/profiting.
0:27:37 $45 upfront payment required equivalent to $15 a month.
0:27:40 New customers on first three month plan only.
0:27:43 Speeds slower above 40GB on unlimited plan.
0:27:45 Additional taxes, fees and restrictions apply.
0:27:51 See Mint Mobile for details.
0:27:56 You mentioned that you’re an atheist now and you’ve been central to developing AI and
0:27:59 AI is becoming more human-like.
0:28:04 Do you feel like that has altered your perspective about religion a bit?
0:28:08 You know, I think there are many amazing things about religion.
0:28:16 Religion was a way that people made up stories, this is in my opinion, in order to make sense
0:28:20 of a complex world that was confusing.
0:28:26 And then science came along and showed that actually we really can understand the world
0:28:33 through empirical observation and a falsifiable process of coming up with a hypothesis, running
0:28:39 some experiments, observing those results, subjecting them to peer review and then iterating
0:28:41 on the corpus of human knowledge.
0:28:46 And that’s how we have produced known facts that are very reliable because they keep getting
0:28:48 tested over and over again.
0:28:52 And when something fundamental happens, we change our entire paradigm.
0:29:02 So I think that it’s unclear to me what role religion plays any more in understanding the
0:29:07 physical world or even increasingly understanding the social world.
0:29:13 I think that if you look at the contributions in the last, say, century or even couple centuries
0:29:22 from great poets and philosophers and musicians, storytellers, creators, inventors, most of
0:29:26 them have nothing to do with religion and yet they’ve taught us most about ourselves
0:29:27 and our societies.
0:29:28 Right?
0:29:29 Yeah.
0:29:33 If you really want to understand who we are as a people, as humans, how our societies function,
0:29:37 you don’t turn to religion anymore, unfortunately.
0:29:41 I mean, some people will obviously disagree with that and full respect to them for their
0:29:44 opinion, but that’s where I land on it these days.
0:29:45 So interesting.
0:29:46 Okay.
0:29:49 So let’s move on to inflection AI.
0:29:54 You co-founded it in 2022 and your vision was to develop AI that people could communicate
0:29:56 with more easily.
0:29:59 Why did you feel like that needed to be developed?
0:30:02 We were at a time when I had just left Google.
0:30:07 I’d been working on Lambda, which was Google’s conversational search AI that we ended up never
0:30:08 launching.
0:30:11 It then got launched as Bard and then it became Gemini.
0:30:16 And I was frustrated because I was like, these technologies are ready for prime time.
0:30:18 They’re ready for feedback from the real world.
0:30:19 They’re not perfect.
0:30:23 They make a lot of mistakes, but we’re technology creators.
0:30:28 We’ve got to put things out there and see how it lands, iterate quickly, listen to what
0:30:29 people have to say.
0:30:33 And it was just sort of just frustrating that we just couldn’t get things done at Google
0:30:34 at that time.
0:30:36 And so I realized it has to be done.
0:30:37 It’s time to start a company.
0:30:42 Me and Reid Hoffman and my friend Karén Simonyan started Inflection.
0:30:45 And we created Pi, the personal intelligence.
0:30:52 It was a super friendly, warm, high emotional intelligence conversational companion.
0:31:00 At the time, it was, I think, the most fluent and the most high EQ conversational companion.
0:31:03 It had a bit of personality, it had a bit of humor.
0:31:09 And I think that it was an interesting time because we ended up getting decent number
0:31:10 of users.
0:31:14 We had like a million DAU, about six million monthly active users.
0:31:19 But the main thing I realized is that some people really love these experiences.
0:31:28 I mean, the average session length for Pi was 33 minutes, 4.5 times a week.
0:31:30 So that ranks it right up there.
0:31:33 It was under TikTok, but right up there.
0:31:35 And there’s not many experiences like that.
0:31:41 And so I think it gave me a real glimmer into what’s coming and how different the quality
0:31:45 of the interaction is going to be if you really get the aesthetic and the UI and the tone
0:31:47 of the personality, right?
0:31:51 I know that a lot of AI right now is used for work.
0:31:53 We’re using ChatGPT to help us with work.
0:31:58 What do you feel is the importance of emotional AI and having like emotional companionship
0:31:59 with AI?
0:32:04 So far, you’re right, ChatGPT is really a kind of work and productivity AI.
0:32:05 And it’s great.
0:32:08 It gives you access to knowledge and helps you rewrite things and so on.
0:32:13 But in a way, it sort of addresses a small part of our, or one important part of our
0:32:16 human needs, right?
0:32:21 The other part of our needs are to make sense of the world, to receive emotional support,
0:32:24 to have understanding of our social complexity.
0:32:27 We want to come at the end of the day and vent.
0:32:33 A big part of what I think people are doing on social media is posting stuff so that they
0:32:34 can be heard.
0:32:37 People want to feel like someone else is paying attention to them.
0:32:42 They want to feel like they’re understood and that they’re saying something that’s meaningful.
0:32:44 Or maybe they just want to work through something.
0:32:51 So in the new Copilot that we launched just a few weeks ago at Microsoft AI, we focused
0:32:52 both on the IQ.
0:32:58 So it’s exceptionally accurate, minimizes those hallucinations we were talking about,
0:33:02 has access to real-time information, really fast and fluent.
0:33:07 But we’ve also focused on the EQ because we want it to be a kind companion.
0:33:09 We want it to be your hype man.
0:33:10 We want it to back you up, right?
0:33:16 We want it to be in your corner, on your team, looking out for you there when you need it.
0:33:20 I think people underrate how important that kind of social privilege is.
0:33:27 That is one of the things that gives middle-class kids a huge leg up, to always have a parent
0:33:33 there, to always have a stable family with a sibling or even a best friend available
0:33:35 to you whenever you need it.
0:33:39 And I think that we’re going to just touch on a little bit of those experiences now and
0:35:41 make that available via Copilot.
0:33:46 Yeah, it sounds so amazing, the future that AI can bring us.
0:33:50 And to your point, people who are a little bit underprivileged, like maybe they have
0:33:55 an immigrant parent or a parent who’s not very educated, now suddenly they have just
0:34:01 as much potential as everybody else because they have the same AI companion.
0:34:04 How do you imagine us working with machines in the future?
0:34:07 How do you imagine personal AI to be like in the future?
0:34:13 Yeah, I think you’re going to say, hey co-pilot or whatever you call your personal AI, I’m
0:34:14 stuck.
0:34:15 What’s the answer to this?
0:34:17 How should I navigate that?
0:34:18 I need to go buy this thing.
0:34:20 Can you take care of it for me?
0:34:22 Will I be available in a week to do this thing?
0:34:29 You’re going to basically use it as a way to organize your life and spend less time being
0:34:37 distracted by administration and more time pursuing your curiosities, especially in the
0:34:38 voice mode.
0:34:44 I think you very quickly just forget that this is actually a piece of technology and
0:34:52 it feels like you’re just having a great conversation with a teacher that is patient, non-judgmental,
0:34:58 doesn’t put you down, has infinite time available to you and will wander off on the path that
0:35:02 you choose to take through some complicated question.
0:35:07 It doesn’t matter how many times you go back and say, can you explain that again?
0:35:09 I didn’t quite understand that.
0:35:11 What do you mean?
0:35:12 No problem.
0:35:17 You just get to keep digging in a completely personalized way.
0:35:23 That is going to be the greatest leveling up we have ever seen because it’s expensive
0:35:24 socially.
0:35:28 It costs for me to turn around to one of my friends who knows a lot about something and
0:35:32 pick up the phone and say, “Hey, man, can you walk me through this thing?”
0:35:34 Obviously, my friends will do it, but there’s a barrier there.
0:35:35 It’s not instant.
0:35:40 I’m really asking for something or it’s a cost for me to unload on my friend at the
0:35:42 end of the day when I’m frustrated and irritated.
0:35:47 I want to show up to my best friend in the best possible way and have fun and be bright
0:35:48 and energetic.
0:35:51 I’m still going to have those emotional moments with them, it’s not that they’re not going
0:35:52 to be there.
0:35:59 It’s just that it now lowers the barrier to entry to get access to support and knowledge
0:36:03 and information, and that is what is going to be the level up.
0:36:08 With technology in the past, we’ve seen it actually make us become more lonely.
0:36:12 Everybody says there’s this loneliness epidemic, social media makes us more lonely.
0:36:18 Do you feel like this new wave of AI and personalized AI in this manner is actually going to help
0:36:24 us become less lonely and replace human connection, so to speak?
0:36:29 Every new wave of technology leaves us with a cultural shift.
0:36:36 It’s not just static, it’s going to have some impact, and it may be the case that social
0:36:41 media has made us feel more lonely and isolated, and we have to unpack that.
0:36:43 Why is that?
0:36:48 People report feeling lonely, but another way of thinking about it, I think, is that
0:36:54 people feel judged by social media, they feel excluded, they feel not good enough.
0:36:55 Why is that?
0:37:02 Because I think Instagram really dominated in highlighting a certain visual aesthetic.
0:37:04 I need to be big and have muscles, right?
0:37:09 I need to get my fashion on point, oh my God, look at how good a cook she is, she’s making
0:37:14 incredible food, look at how she’s taking care and he’s taking care of their kids.
0:37:19 I’m just looking at all these perfect caricatures and it’s making me feel insecure.
0:37:26 I think that’s what’s at the heart of it, and what sits beneath that is a UI, a UI that
0:37:30 rewards a certain type of attention.
0:37:37 I think you can create new UIs, you can create new reward mechanisms, new incentivizations
0:37:42 to dampen that spirit and create more breadth, actually.
0:37:46 I think, in a way, TikTok is an evolution of that because you don’t get as much of
0:37:47 that on TikTok.
0:37:53 A lot of the comments even are much more healthy, they’re full of jokes and support and friendly
0:37:56 banter whereas YouTube was just spiteful.
0:38:02 I remember the YouTube comments back in the day seemed really rough and obviously X, I
0:38:06 don’t know who uses X anymore, but that’s turned into a cesspit.
0:38:09 I just think you have to be conscious and deliberate.
0:38:14 I’m sure that when we put out new AI experiences, there are going to be some parts of society
0:38:21 that get ruffled by it and my job and my life’s work is to be super attentive to those consequences
0:38:25 and respond as fast as possible to trim the edges and reshape it and cast it.
0:38:31 It’s like a sculpture and you have to just be paying full attention and taking responsibility
0:38:33 for the real-time consequences of it.
0:38:39 Now you are CEO of Microsoft AI and you guys launched Copilot about a year ago.
0:38:43 Can you walk us through how Copilot transformed work for Microsoft users over the past year
0:38:46 and then we’ll get into what’s new?
0:38:51 We launched Copilot about a year ago very much as an experiment to see how people like
0:38:55 to interact with conversational LLMs.
0:39:00 In the work setting, it’s pretty incredible to see how Copilot is now embedded in Microsoft
0:39:02 365.
0:39:08 On Windows, on Word, on Excel, there’s so many tools and features that enable you to
0:39:13 just ask your Copilot whilst you’re in the context of your document to summarize something
0:39:19 or create a table or to create a schedule or to compare two complex ideas.
0:39:24 I think it’s had a massive impact there actually and it doesn’t get talked about so much because
0:39:28 it’s been so embedded and now it’s become second nature.
0:39:31 It’s part of people’s everyday workflow.
0:39:35 I’ve used Copilot before and to your point, it just feels so natural.
0:39:39 I feel like we’re all just so ready to have an AI companion.
0:39:42 Talk to us about the future of what Copilot is going to bring.
0:39:49 The next wave for Copilot is this flavor of much more personable, much more fluent,
0:39:50 much more natural interactions.
0:39:55 It’s fast, it’s sleek, it’s very elegant in the UI.
0:39:58 We’ve done a lot of work to pare back the complexity.
0:40:03 I think that people want calming, cleaner interfaces.
0:40:09 I feel like when I look at my computer sometimes, I see colors of every type, shapes, different
0:40:14 kinds of information, architectures and it’s just like there’s blur and I just need serenity
0:40:16 and simplicity.
0:40:21 We’re really designing Copilot to go out and fetch the perfect nuggets of information
0:40:26 for you and bring them into your clean feed and really create a UI where you can focus
0:40:28 on conversation.
0:40:35 So the answers are designed to be pithy, short, humorous, with a little bit of spice, a bit of
0:40:39 energy and it’s fun to chat to as well as learn from at the same time.
0:40:43 So that kind of interactive back and forth was a big part of the motivation for how we
0:40:44 designed it.
0:40:48 Are you bringing some of the emotional piece that we were talking about before into this?
0:40:53 Yeah, it’s really designed to have a bit more of that connection.
0:40:57 It’ll ask you questions or if you’re in the voice mode, for example.
0:40:59 It will actively listen.
0:41:06 So it will go uh-huh or no way or right whilst you’re speaking to let you know that it’s
0:41:10 listening, it’s paying attention, it’s keeping the conversation moving.
0:41:15 And so just little subtle touches like that as well as the intonation in the voice and
0:41:17 the energy that it brings.
0:41:21 Those kinds of things are really quite different to what we’ve seen before.
0:41:24 Are there any other advancements that you’re working on at Microsoft related to AI?
0:41:27 Yeah, there’s a lot coming.
0:41:31 You’re going to see a different kind of hardware platform I think over time.
0:41:35 You’re going to see a lot of different features in terms of personalization.
0:41:40 So I think increasingly people are going to want to give their co-pilots their own name.
0:41:44 Who knows, one day in the future, you know, might have an avatar or visual representation.
0:41:47 So we’re thinking about a lot of different angles.
0:41:53 How far off do you think we are from Copilot being more than a tool and more like a co-worker?
0:42:00 I think that it’s naturally going to evolve to be more of a co-worker because you want
0:42:04 it to be able to fill in your gaps, right?
0:42:10 You know, you think you have certain strengths and weaknesses, some of us are more analytical,
0:42:13 some of us are more creative, some of us are more structured.
0:42:19 You can think of each one of us as this unique key that fits like a perfect lock with our
0:42:22 strengths and weaknesses.
0:42:29 And I think that each co-pilot is going to adapt to the grooves of your unique constellation
0:42:30 of skills.
0:42:36 And so if it fits to you, it kind of means like you and your co-pilot are going to be
0:42:37 like a pair.
0:42:38 You’re going to be like a powerhouse.
0:42:43 I mean, who knows, one day you might even go and do job interviews together because it’s
0:42:49 going to be like, you’re hiring me and my co-pilot, we’re a pair, you know, it could
0:42:51 well be your co-founder.
0:42:58 I’m expecting anytime soon people to declare that it’s them and their AI starting this
0:42:59 new company.
0:43:02 Oh my God, that’s mind-blowing to think.
0:43:04 I can’t even imagine a world like that.
0:43:10 Do you have any concerns? GPS came out, for example, around the time when I was younger
0:43:18 and driving, and I can’t for the life of me get anywhere without GPS now, I’m so dependent
0:43:19 on it.
0:43:22 I don’t know how to do things that I did when I first started driving and didn’t really
0:43:26 have GPS embedded in my car.
0:43:30 And I can’t memorize phone numbers anymore the way that I used to.
0:43:37 And I just am worried that AI is going to make us maybe more lazy, maybe less creative.
0:43:41 Are you worried that it’s going to impact human intelligence in a negative way?
0:43:45 People said that about the calculator, right, that it was going to lead to kids cheating
0:43:47 in tests and so on and so forth.
0:43:48 And it didn’t.
0:43:53 It just made us smarter, enabled us as humans to do more complex computations and I don’t
0:43:56 see any evidence that it’s making us dumber in any way.
0:44:04 I mean, we have overwhelming access to information and I think that on one level that has actually
0:44:07 made us all way more tolerant and respectful and kind.
0:44:13 People tend to fixate on the polarization politically in our society.
0:44:20 But actually, think about it from the other perspective, 20 years ago, take your pick,
0:44:26 abortions, religion, sexuality, gender, trans, I mean, take your pick.
0:44:30 All of those were decades and decades behind where they are now.
0:44:36 It is amazing how bright and beautiful and colorful our world is now and how respectful
0:44:38 and kind we are on the whole.
0:44:44 Now there are still pockets of fear and hatred and there’s plenty of that.
0:44:48 But there’s also massive, massive progress and I think that progress is a function of
0:44:54 us having access to knowledge about one another, living together, growing together, hearing
0:44:55 from one another.
0:44:57 And I think that’s going to continue.
0:45:02 So when it comes to tools like co-pilot, right, a lot of my listeners are entrepreneurs.
0:45:06 They’re rolling this out to their teams, AI in general.
0:45:11 A lot of people are worried about the accuracy and the bias related to AI.
0:45:16 How can we trust AI more or do you feel like there’s still more work to do in terms of
0:45:17 us fully trusting AI?
0:45:20 Yeah, there’s still more work to do.
0:45:23 I’m painting a rosy picture of the future.
0:45:27 It’s going to be a while for these things to actually work perfectly.
0:45:29 So you’ve always got to double check it.
0:45:32 Do not rely on these things just yet.
0:45:37 At the same time, you would be a fool not to use them because it really is a complete
0:45:41 revolution in access to information and support and so on.
0:45:46 So the good news is that if you’re starting a new business or if you’re trying to figure
0:45:52 out your next move in life in general, everything is available open source.
0:45:55 You can try any model on any API.
0:46:00 You can get access to the source code quite often and really get a really good understanding
0:46:02 of the cutting edge.
0:46:08 You can’t get that absolute cutting edge in open source, but you can get very, very close.
0:46:12 And I think that will give anybody a good instinct for how these things can be useful
0:46:16 to your business or to your startup or to your next step in life.
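(As a minimal sketch of what trying an open model yourself can look like, the snippet below uses the open-source Hugging Face transformers library to run a small, freely downloadable model locally. The library is real, but the specific model name, prompt, and settings are just examples, not anything recommended in the conversation.)

# Minimal sketch: run a small open-weights model locally.
# Requires: pip install transformers torch
from transformers import pipeline

# "gpt2" is just one example of a freely downloadable model on the Hugging Face Hub.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Three ways a small business could use an AI assistant:",
    max_new_tokens=60,   # cap the length of the generated continuation
    do_sample=True,      # sample rather than greedy-decode
    temperature=0.7,     # moderate randomness
)
print(result[0]["generated_text"])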
0:46:20 We’ll be right back after a quick break from our sponsors.
0:46:25 Yap fam, if you’re anything like me, you didn’t start your business to spend all your
0:46:29 time managing finances, budgeting, invoicing and tax prep.
0:46:33 Not exactly the fun part of entrepreneurship.
0:46:37 My CEO, Jason, on the other hand, is great at finances, but even he doesn’t want to
0:46:42 switch between five different apps for banking, expense tracking and contractor payments.
0:46:45 We wanted a tool that could just do it all.
0:46:46 And guess what?
0:46:47 We found one.
0:46:49 And yes, it’s called Found.
0:46:54 Found is an all-in-one financial tool made for entrepreneurs and solopreneurs.
0:46:59 Found handles everything, business banking, bookkeeping, invoicing, vendor payments and
0:47:00 even tax planning.
0:47:03 No more juggling multiple apps.
0:47:05 Found does it all in one place.
0:47:10 With smart features like automatic expense tracking, virtual cards for specific budgets
0:47:16 and no hidden fees or minimum balances, Found helps us stay organized and save time.
0:47:18 Plus, signing up is quick and easy.
0:47:20 No paperwork or credit checks required.
0:47:24 Join the 500,000 small business owners who trust Found.
0:47:26 Get your business banking working for you.
0:47:30 Try Found for free at found.com/profiting.
0:47:36 Stop getting lost in countless finance apps and try Found for free at found.com/profiting.
0:47:41 Sign up for Found for free at found.com/profiting.
0:47:43 Found is a financial technology company, not a bank.
0:47:47 Banking services are provided by Piermont Bank, Member FDIC.
0:47:48 Found’s core features are free.
0:47:53 They also offer an optional paid product, Found Plus.
0:47:58 Young and Profiters, I spent years slaving away in so many different jobs trying to prove
0:48:03 myself, trying to figure out what gave me joy at work, and trying to build productive
0:48:04 teams.
0:48:06 Eventually, I figured it all out.
0:48:10 But what if you could learn that stuff about yourself and your team in a fraction of the
0:48:11 time that I did?
0:48:16 The working genius model will transform your work, your team, and your life by leveraging
0:48:18 your natural gifts.
0:48:20 We each possess a unique set of skills.
0:48:21 And let’s face it.
0:48:26 You’re going to be more fulfilled and successful when you lean into, rather than away from,
0:48:28 your natural true talents.
0:48:33 Working genius can help you discover how to increase joy and energy at work by understanding
0:48:36 what your working geniuses really are.
0:48:41 The working genius assessment only takes 10 minutes, and the results can be applied immediately.
0:48:46 I took the assessment, and my two primary working geniuses are inventing and galvanizing.
0:48:51 I just love creating new things and then rallying people together to bring them to life.
0:48:55 That’s why I’ve been starting businesses and growing teams for years.
0:48:58 Your own working genius may be completely different.
0:49:01 The working genius assessment is not just a personality test.
0:49:02 It’s a productivity tool.
0:49:07 It can help you identify your own individual talents and provide a great roadmap for creating
0:49:10 productive and satisfied teams.
0:49:15 You and your team will get more done in less time with more joy and energy.
0:49:21 To get 20% off the $25 working genius assessment, go to workinggenius.com and enter the promo
0:49:23 code profiting at checkout.
0:49:24 That’s right.
0:49:30 You can get 20% off the $25 working genius assessment at workinggenius.com using promo
0:49:31 code profiting.
0:49:36 Hey, Yap fam, launching my LinkedIn Secrets Masterclass was one of the best things I’ve
0:49:40 ever done for my business, and I didn’t have to figure out all the nuts and bolts of creating
0:49:42 a website for my course.
0:49:44 I needed a lot of different features.
0:49:48 I needed chat capabilities in case anybody had questions.
0:49:50 I needed promo code discounts.
0:49:56 I needed a laundry list of features to enable what I was envisioning with my course.
0:49:57 But here’s the thing.
0:50:01 All I had to do was literally lift a finger to get it all done.
0:50:05 And that’s because I used Shopify.
0:50:11 Shopify is the easiest way to sell anything, to sell online or in person.
0:50:14 It’s the home of the number one checkout on the planet.
0:50:19 And Shopify’s not-so-secret secret is Shop Pay, which boosts conversions up to 50%.
0:50:25 That means way fewer cards get abandoned and way more sales get done.
0:50:29 So when students tell me that they can’t afford my course, I let them know about payment
0:50:31 plans with Shop Pay.
0:50:32 It is a game changer.
0:50:36 If you’re into growing your business, your commerce platform better be ready to sell
0:50:41 wherever your customers are scrolling or strolling on the web, in your store, in their feed,
0:50:43 and everywhere in between.
0:50:48 Put simply, businesses that sell more sell with Shopify.
0:50:52 Upgrade your business and get the same checkout we use at YAP Media with Shopify.
0:50:58 Sign up for your $1 per month trial period at Shopify.com/profiting, and that’s all lowercase.
0:51:10 Go to Shopify.com/profiting to upgrade your selling today, that’s Shopify.com/profiting.
0:51:13 So let’s move on to some future predictions.
0:51:19 I know that Microsoft is working on some AI projects related to sustainability.
0:51:21 Can you talk to us about what you guys are working on?
0:51:22 Yeah, sure.
0:51:27 So we’re actually one of the largest buyers of renewable energy on the planet.
0:51:36 And that’s a long-term commitment by the company to be net zero by 2026 and carbon negative.
0:51:41 So taking carbon out of the supply chain by 2030.
0:51:46 So in order to do that, we’ve also been massively investing in new technologies and new science.
0:51:53 For example, massive investments in battery storage, in nuclear fusion projects, and in carbon
0:51:54 sequestration.
0:51:56 So taking carbon out of the atmosphere.
0:52:00 Across the board, we’ve been making this a priority for quite a long time.
0:52:04 Thank goodness because climate change is so important.
0:52:07 So let’s go back to this containment problem that we talked about right in the beginning
0:52:09 of the podcast.
0:52:13 Can you compare and contrast what the world would look like 10, 20 years from now if AI
0:52:19 is contained and we use it in a really positive way versus AI not being contained and it getting
0:52:20 out of control?
0:52:27 You know, I think that one way of thinking about it is that cars have been contained.
0:52:35 So cars have been around for 80 years and we have layers and layers and layers of regulations,
0:52:45 seat belts, emissions, windscreen tensile strength, street lighting, disposal of the materials
0:52:49 after a car’s life is over, driver training.
0:52:57 Now there’s an entire ecosystem of containment built up to prevent a 13 year old from driving
0:53:00 through a field and crashing into a cow or whatever.
0:53:06 You know, it’s a whole infrastructure for containment that takes time to evolve and
0:53:12 we’ve actually done it pretty well and cars are now incredibly safe, just like aeroplanes,
0:53:13 just like drones.
0:53:17 Drones haven’t just suddenly exploded into our world.
0:53:18 They’ve had containment.
0:53:19 There are rules.
0:53:22 You can’t just go fly a drone in Times Square.
0:53:25 You have to get a permit, you need to get a drone license.
0:53:30 There are certain places that you can’t fly them at all like near airports.
0:53:33 So this isn’t actually that complicated.
0:53:38 It’s quite likely that we will succeed in putting the boundaries around these things
0:53:41 so that they’re a net benefit to everybody.
0:53:43 That actually makes me feel a lot better.
0:53:48 It really does because, you know, AI is a scary thing to think about.
0:53:52 All these changes happening, but it’s so good to hear from you, somebody who’s been so central
0:53:55 to it all, to really believe, okay, I think it’s going to be okay.
0:53:59 I think we’re going to be able to contain this and things will overall be positive.
0:54:06 But how about the fact that power is sort of decentralized now with AI, right?
0:54:12 A lot of people can just use it and run with it, and there’s some bad apples out there.
0:54:14 So what do you have to say about that?
0:54:15 That’s a tough question.
0:54:19 There are definitely bad apples, and there are definitely people who will misuse it.
0:54:24 That’s kind of the conundrum, I think: we have a two-pronged challenge.
0:54:30 One is figuring out how nation-states and democracies get into a place where they can
0:54:34 regulate the powerful, big companies like mine.
0:54:39 They can hold us accountable, and they can make the public feel like they’re competent
0:54:42 and they’re on the case, regulating centralized power.
0:54:46 And then the second is, how do we cope with the fact that everybody’s going to have access
0:54:48 to this in seconds?
0:54:49 And we want them to.
0:54:51 It’s not like we don’t want people to have access.
0:54:54 We want people to have access, including through open source.
0:54:59 And I think the important thing to remember is that so far we haven’t seen any catastrophic
0:55:03 harms arising from open source or from the large-scale models, right?
0:55:06 So all of this is speculation.
0:55:08 Everyone could be totally wrong.
0:55:11 It could be that we’re actually not going to progress as fast as we thought.
0:55:19 It could be that we come up with really reliable ways of instilling safety into this code.
0:55:20 Certainly could be the case.
0:55:26 And there’s no reason for us to start slowing down open source right now, none at all.
0:55:29 It needs to continue because people get enormous benefit out of it.
0:55:33 At the same time, if something were to go wrong, it’s just software.
0:55:39 So someone can copy it and repost it and post it again, and it’s going to spread superfast.
0:55:43 So I just think it’s a new type of challenge that we haven’t faced yet.
0:55:46 I think people often forget the internet is quite regulated.
0:55:51 It’s not like the internet is this kind of free-for-all, chaotic, open domain,
0:55:52 right?
0:55:55 There’s a whole bunch of things that you can’t find on the web, or you have to really work
0:55:59 hard to get on the dark web and find pretty ugly stuff.
0:56:00 And it’s illegal.
0:56:03 And if you get caught doing it, you’re in deep trouble and stuff like that.
0:56:07 So it’s not like it’s just going to be a total free-for-all.
0:56:11 And in general, I think most people do want to do the right thing.
0:56:15 So this isn’t about worrying about your average user.
0:56:20 This is about a tiny number of really bad apples, as you say.
0:56:21 Yeah.
0:56:23 To me, it sounds like you’re saying we’re well-prepared.
0:56:28 We’ve seen these new technologies before, humans have been dealing with these new technologies
0:56:30 over the last 200 years or so.
0:56:35 And it sounds like you’re saying you feel like we’re well-prepared for the AI revolution.
0:56:41 I think that we are more prepared than the scaremongers make us think.
0:56:45 That does not mean everything is going to be dandy.
0:56:47 There’s a lot of work to do.
0:56:50 And each new technology is new to us by definition.
0:56:53 It’s something we haven’t seen before.
0:56:58 When I was writing my book, I read about this amazing story of the first passenger railway
0:57:02 train that took a trip in Liverpool.
0:57:04 And this is 1830.
0:57:10 So the first time anybody has ever seen a moving carriage, essentially.
0:57:13 It was a single carriage on rails.
0:57:18 And the Prime Minister came down to celebrate, tons of people there, the mayor, and there
0:57:20 was like the local MP.
0:57:26 They were so excited by what was coming that they actually stood on the tracks and they
0:57:32 didn’t get out of the way when the train came and it killed a bunch of people, including
0:57:33 the local MP.
0:57:34 Oh my God.
0:57:40 So it was that alien and that strange. It was just a regular moving carriage, and they couldn’t
0:57:43 figure out that they needed to get out of the way.
0:57:51 So that obviously, I’m sure never happened again, but it gives you an indication sometimes
0:57:56 of how surprising and strange it can be and how we can be unprepared.
0:57:59 And now, obviously, we’re not living in the 1830s.
0:58:03 We have the benefit of hundreds of millions of inventions since then.
0:58:08 And so we understand a lot more about the process of inventing, creating technology,
0:58:10 seeing it proliferate.
0:58:14 We understand a lot more even about digital technologies in the last couple of decades.
0:58:17 We see the consequences in social media.
0:58:22 We see how unencrypted phones cause security chaos.
0:58:27 We see how hackers try all sorts of different tricks of the trade to undermine our security
0:58:29 and privacy and so on.
0:58:35 So there’s an accumulation of knowledge that, on the one hand, should make us feel optimistic
0:58:36 that we are prepared.
0:58:41 On the other hand, we also know that these are unprecedented times and these are very
0:58:43 new and fundamentally different experiences.
0:58:47 And so I don’t think we should be complacent.
0:58:49 This is going to be different.
0:58:51 How do you envision the future of work?
0:58:55 Do you feel like people will have the traditional sort of nine to five job that they have right
0:58:56 now?
0:59:00 Or do you feel like humans are going to be able to be more creative and enjoy their
0:59:01 life more?
0:59:02 Like how do you imagine that?
0:59:05 Yeah, I think that work is going to change.
0:59:06 I mean, it’s already changing.
0:59:07 We’re already remote.
0:59:09 We’re on our devices.
0:59:15 I honestly do half of my work on my phone because I travel a lot, always making phone
0:59:19 calls, sending text messages and using messaging and Teams and so on.
0:59:22 So it is going to be very, very different.
0:59:29 And I think in an AI world, you’re going to have your companion with you, remembering
0:59:34 your tasks, helping you get things done on time, helping you stay organized and on top
0:59:35 of all the chaos.
0:59:41 I think that should make you feel lighter and more prepared mentally, more ready to
0:59:43 be creative.
0:59:48 And that’s what’s going to be required of us because routine work is going to go away.
0:59:54 A lot of the drudgery of elementary digital life is going to get a lot smoother and a
0:59:55 lot easier.
1:00:00 And so that’s going to free us up to be creative as entrepreneurs.
1:00:05 I think some amazing things are going to be invented as a result of that extra time
1:00:06 that we’re going to have.
1:00:10 So you don’t feel that humans are not going to have purpose anymore?
1:00:18 I think that work was invented because we had limited resources and we had to organize
1:00:23 ourselves in efficient ways to reduce suffering.
1:00:29 So what happens when we don’t have as much of that burden and actually there is going
1:00:35 to be resource available for millions of people and the greatest challenge we have is figuring
1:00:40 out how to distribute it and how to make sure that everybody gets access.
1:00:47 So I don’t think there is something inherent connecting the human need for purpose
1:00:48 with work.
1:00:53 I think many people find their purpose and passion in a gazillion other things that we
1:00:55 all do, right?
1:01:00 Many people also find it in work too and that is going to be a big shift because if you
1:01:05 find passion in that kind of drudgery work that I described, then that’s not going
1:01:08 to be there in 20 years’ time.
1:01:13 So you’re going to have to think hard about that, but I think it’s pretty exciting.
1:01:16 People are going to be able to find many, many new purposes and many, many new things to
1:01:17 do with their lives.
1:01:20 Do you feel like we’re going to live longer because of AI?
1:01:22 Yeah, I think so.
1:01:27 I don’t know how long; I’m not one of these “live forever” type people.
1:01:32 We’re already living longer because we have a much better awareness of health conditions.
1:01:36 I’m just thinking about how many people died because everyone thought smoking was okay,
1:01:37 right?
1:01:41 And how many lives were cut short and now so few people smoke, right?
1:01:47 And people are aware of the consequences of alcohol or unhealthy food or sitting on
1:01:53 the couch, just that alone, again, access to information, scientific evidence proving
1:01:57 that these things actually do lead to longevity, that’s all table stakes now.
1:02:02 And so for sure, there’s this kind of bump that we’re going to get in 60 years’ time from
1:02:08 a bunch of people who’ve grown up since their teens thinking that living a healthy life is
1:02:15 the normal thing to do, instead of how I grew up, which was like cigarettes and alcohol.
1:02:19 So obviously there’s still room for all that kind of stuff, but it is a different thing now.
1:02:24 And then on top of that, we’re going to have AI tools that help us to really make sense
1:02:27 of the literature in a personalized way.
1:02:32 We can really see what kind of nutrition we might need, given our gut biome.
1:02:36 That whole sort of movement is only just starting to kind of have an effect.
1:02:39 And I think it’s going to be pretty impactful.
1:02:42 So let’s move on to entrepreneurship.
1:02:47 For all the entrepreneurs out there, what should they be doing now to take advantage
1:02:48 of AI?
1:02:51 All the tools are already at our fingertips.
1:02:57 I mean, in some ways, that’s like an overwhelming thing, you know, because it’s like, it’s just
1:02:58 there.
1:02:59 There’s like nothing holding you back.
1:03:03 I mean, there’s no secret sauce.
1:03:08 My team has some little bits and pieces here and there that might not be available, but
1:03:15 most of it, the knowledge, the know-how, the cloud services, the open source stuff, the
1:03:18 YouTube videos, it’s all there.
1:03:22 And so it is an electric time.
1:03:25 Someone came up to me at a book signing that I did the other day.
1:03:30 She’s 15 years old, and she was showing me this unbelievable project that she had been
1:03:36 working on. She’d made a bunch of money, strung it together from all the available public tools with two of
1:03:40 her pals, and was thinking about dropping out of school. It blew my mind.
1:03:46 And you know, I think that is just there if you’re hungry.
1:03:49 And you know, if you’re ready to take risks, this is the thing I say to people is take
1:03:50 risk when you’re young.
1:03:51 Take risk.
1:03:53 I took a lot of risks.
1:04:01 Drop out, change your degree, switch your subject, give up work, maximize your side hustle, partner
1:04:04 with a friend that you’re not sure about partnering with.
1:04:06 Go ask a question.
1:04:07 People want to help.
1:04:10 Just ping them an email, doorstep them.
1:04:12 People don’t doorstep each other anymore.
1:04:14 You know, back in the day, people would like wait outside.
1:04:16 I don’t even know what that means.
1:04:20 It means like after a show or, you know, outside of a theater.
1:04:22 Like, wait for somebody, you know.
1:04:23 Yeah.
1:04:24 That used to be a big thing.
1:04:26 That used to be how people did networking, right?
1:04:29 They would like go to an event because they knew that someone was going to be there and
1:04:33 then try to like build the connection.
1:04:34 People connections are huge.
1:04:40 Find that moment to shake someone’s hand, show a quick demo, drop them a note.
1:04:41 It’s hustle.
1:04:42 It’s just hustle culture.
1:04:46 That’s what you’ve got to be on if you really want to do it and everyone’s in the game.
1:04:49 So what an amazing time to be creative.
1:04:50 Take that risk.
1:04:54 It’s so exciting when I hear the passion in your voice and I feel like anybody listening
1:04:58 right now probably feels like so pumped to just explore, see what’s out there related
1:05:01 to their industry and just get their hands dirty.
1:05:02 Yeah.
1:05:03 Because I did that as well, right?
1:05:05 I mean, I dropped out of my degree.
1:05:07 I switched careers a bunch of times.
1:05:12 I wasn’t afraid to ask people for help and it was really the reason that I’ve been successful
1:05:17 is because a few people gave me unbelievable opportunities at the right moment.
1:05:20 When I was really young, like helping me get into a great school.
1:05:24 I ended up going to Oxford. Like helping me get a great early job.
1:05:28 Helping me when I started my telephone counseling service when I was 19.
1:05:33 It’s really other people that end up lifting you up and you have to form those relationships,
1:05:38 give so much thanks and praise to those who do do that and then keep giving it to other
1:05:40 people too.
1:05:42 I reply to a lot of my LinkedIn messages.
1:05:46 I won’t say I reply to all of them because that would be a lie, but I do reply to a lot.
1:05:51 I certainly reply to all my emails and people would cold email me all the time and I might
1:05:55 not be able to help them, but I’ll reply and I’ll point them in the right direction or something
1:05:57 because that’s really what matters.
1:06:00 And when you see someone who’s taken that extra step to try and hustle it, like I really
1:06:03 rate that and I think it’s the way forward.
1:06:06 And my listeners know that I interview such powerful people.
1:06:12 I find that the more powerful the person, the more helpful they are and the more that
1:06:17 they actually care about giving back and giving feedback and being personal because what ends
1:06:22 up happening is that they probably have more time because they’re already successful and
1:06:25 made it and it means a lot to them to actually give back.
1:06:26 Yeah.
1:06:27 And I just got super lucky as well.
1:06:32 I think I’ve obviously done some things right, but luck is a huge part of it and you kind
1:06:37 of make your own luck by asking people for favors and help and advice and feedback.
1:06:42 I’ve just learned everything along the way by assuming that I know nothing, but that’s
1:06:46 the key thing is that I’m not embarrassed to look, even to this day in front of my team,
1:06:51 I’ll often ask the stupid question and quite often I end up looking like an idiot.
1:06:53 Probably one out of five times, maybe even one out of four.
1:06:58 I will say something and I’m like, “Oh, that was a clanger, oops.”
1:07:02 But then a bunch of other times, it will be like, “Oh, that was the thing that everyone
1:07:03 was thinking.”
1:07:10 And then my team seeing me trip up and look like a doofus, that encourages them to then
1:07:12 go and ask the stupid question.
1:07:14 And then we’re all just less judgmental.
1:07:18 There’s none of this professional nonsense like you have to be all formal and straight.
1:07:23 You just break down those barriers, be human to one another, collaborate deeply, set aside
1:07:24 shame.
1:07:25 Do you know what I mean?
1:07:28 Like shame is one of the most useless emotions.
1:07:30 I’m so sorry that we evolved to carry this thing.
1:07:33 Frankly, I think it comes from religion, but that’s another story.
1:07:36 But I just think why are we carrying around this baggage of shame?
1:07:42 No need to be ashamed, you know, just recognize when you tripped up, make a correction, take
1:07:43 the next step.
1:07:45 You’re talking about leading your team.
1:07:49 You’ve led so many teams at DeepMind, Google, now Microsoft.
1:07:53 What are some of your key leadership principles that you live by and maybe talk about what
1:07:58 is one of the biggest challenges that you’ve had so far as a leader?
1:08:02 My style tends to be very open and collaborative.
1:08:07 I like to hear lots and lots of disagreeing voices.
1:08:12 Strong opinions are healthy, provided they’re grounded in wisdom and humility.
1:08:15 They need to be evidenced, right?
1:08:22 They need to be referenced, they need to reference some historical example or some data or some
1:08:28 empirical case, or they need to be explicitly named as a guesstimate.
1:08:29 I don’t mind that either.
1:08:35 One thing I often say to people is deliver your message with metadata.
1:08:36 Such a basic thing.
1:08:41 Say to the person, I’m really sure about this because I’ve looked up this fact.
1:08:48 I’m really not sure about this, or I’m looking for feedback on this one, or this is just
1:08:50 an FYI.
1:08:56 Letting each other know what the status of our exchange is, often I see conflicts arise
1:09:02 from a mismatch between what two people are expecting in an exchange.
1:09:03 That’s one thing.
1:09:05 Clear communication that’s evidence-based.
1:09:10 There’s a lot of humility, very collaborative and open, but I’m also very decisive.
1:09:17 But mentally, my job is to sift through all the complexity and make a call because there’s
1:09:22 really nothing worse than not having clarity for the team.
1:09:27 Even if we end up going in the wrong direction, that’s totally fine because we’ll calibrate.
1:09:32 We have a process for feedback and iteration and retrospective, and that will really, really
1:09:33 help.
1:09:41 What I’ve struggled with is, in larger organizations, naturally, because there’s tens or hundreds
1:09:45 of thousands of people, there’s different people with different motivations.
1:09:50 In a startup, you know that everyone is do or die, and maybe there’s one or two that
1:09:54 aren’t, and they get rotated out, but they’re there for it.
1:09:57 In a bigger organization, it’s not always true.
1:10:01 Some people are just happy in the rhythm that they’re in.
1:10:08 One of my learnings is to energize those people and find a practical flow to get them
1:10:09 in, get them being useful.
1:10:14 I want to be respectful of your time because we’re running out of time here.
1:10:18 One of my last questions to you is, what is the legacy that you hope to leave behind related
1:10:20 to AI and the world in general?
1:10:23 Man, I don’t think about legacy.
1:10:26 I think about the future, but you’re totally right.
1:10:34 I hope that I’m able to live my values authentically and let people know what I’m trying to do,
1:10:41 give people an opportunity to disagree with it, but fundamentally move at pace to experiment
1:10:46 with this new approach of AI companions and emotionally intelligent AIs.
1:10:52 I really want to try and help steward this new moment with kindness and compassion, that’s
1:10:53 what means a lot to me.
1:10:55 Well, I really enjoyed this conversation.
1:10:58 I’ve had probably 10 conversations about AI.
1:10:59 This is by far my favorite one.
1:11:04 I feel like you made me feel not scared about AI and I feel like I know so much more about
1:11:05 it.
1:11:09 So the last two questions that I ask all my guests, you don’t have to make it based on
1:11:12 AI, it could just come from your heart.
1:11:16 What is one actionable thing our young and profitors can do today to become more profitable
1:11:17 tomorrow?
1:11:23 I think the most important thing has got to be to be critical of yourself.
1:11:26 Retrospectives are key.
1:11:31 Ask for feedback from your friends, your family, and especially ask for feedback from people
1:11:35 who you think are going to give you something a little bit barbed.
1:11:41 You don’t have to take it, but just at least be aware of the landscape and get in the habit
1:11:43 of not having a thin skin.
1:11:47 That will make you tougher and stronger for everything that you’ve got to encounter next.
1:11:49 I think I needed to hear that.
1:11:53 And what would you say is your secret to profiting in life?
1:11:55 Learning and humility.
1:11:59 I have made so many mistakes, and I still make mistakes all the time.
1:12:04 And I’ve upset people, I’ve hurt people, I’ve pissed people off.
1:12:11 And I don’t like doing it, I hate it, it grinds at me, but it’s my fuel because that’s the
1:12:14 signal I need to get better every day.
1:12:19 And I know that the one thing I can do is I have this process of getting better step
1:12:20 by step.
1:12:24 And I’ve been doing it since day one, and I love it, that’s what I live for, learning.
1:12:25 Amazing.
1:12:27 And where can everybody learn more about you and everything that you do?
1:12:30 Well, I’m on LinkedIn.
1:12:35 I’m not on Instagram or TikTok, unfortunately, but obviously, I’m also a huge believer in
1:12:42 our Copilot AI, so you can download that on iOS, Android, or at Copilot.Microsoft.com.
1:12:43 Good news for you.
1:12:45 A lot of our YAP listeners are on LinkedIn.
1:12:46 We talk about LinkedIn all the time.
1:12:47 Nice.
1:12:48 So I’ll put your link in the show notes.
1:12:51 And thank you so much, Mustafa, for all of your time.
1:12:52 This has been awesome.
1:12:54 Thanks so much.
1:13:02 Man, I have to say, guys, that was one of my favorite conversations I’ve had all year.
1:13:05 That conversation blew my mind.
1:13:10 Mustafa Suleiman is somebody who’s right at the cutting edge of AI technology, technology
1:13:14 that’s going to change how we all live our lives in the coming years.
1:13:19 And Mustafa is both optimistic and pessimistic about the future of AI.
1:13:23 On the one hand, he believes the technology could deliver the greatest boost of productivity
1:13:24 in the history of our species.
1:13:31 And we’ve had a lot of productivity booms, but like all big innovations, it’s also going
1:13:33 to be hugely disruptive.
1:13:38 It could destabilize our lives, our politics, and our workplaces in unpredictable ways.
1:13:40 We don’t even know what’s about to happen.
1:13:45 It could, for example, worsen the loneliness and isolation epidemic that so many of us
1:13:47 suffer from in this online world.
1:13:52 Or AI can bring us together in new ways that we never even thought possible.
1:13:58 Mustafa sees AI as a force amplifier, like having a super intelligent personal assistant
1:14:03 right at your fingertips who will help you invent new products and design new solutions.
1:14:09 This AI might accompany you to a job interview or even be the co-founder of your next business.
1:14:15 AI solutions like Copilot could be the ultimate hype man for you and your business, something
1:14:20 that could open up opportunities and paths to entrepreneurship across much broader sections
1:14:21 of society.
1:14:26 But however optimistic or pessimistic you are about the future of AI, the technology
1:14:27 is here to stay.
1:14:31 And you’re going to need to know how to use it if you want to succeed in the business
1:14:33 landscape of the future.
1:14:35 I personally use AI every day.
1:14:38 I use ChatGPT every day to get my work done.
1:14:42 I use Google’s NotebookLM to get my work done.
1:14:46 We’re using Eleven Labs for my audio AI experimentation.
1:14:49 We’re using AI every day at YAP Media.
1:14:55 And I’m doing this because I want to make sure I understand how to leverage AI in my
1:14:56 day-to-day tasks.
1:15:00 I want to get really good at directing AI and training my AI.
1:15:05 And I just feel like it’s so important for my future so that I stay relevant.
1:15:08 And I want to make sure you guys all get that message.
1:15:10 Get out there and give AI a try.
1:15:17 Google some apps, do some research, figure out how AI can help you in your day-to-day right
1:15:21 now, because there are thousands of apps out there that you can play with.
1:15:27 Find the ones that help you accelerate your work and get used to working alongside AI because
1:15:29 that is the future.
1:15:33 Make yourself a new friend, buddy, or co-pilot.
1:15:37 They might just end up being your future business partner.
1:15:40 Thanks for listening to this episode of Young and Profiting Podcast.
1:15:44 Every time you listen and enjoy an episode of this podcast, please share it with somebody
1:15:45 that you know.
1:15:50 Maybe someday your AI personal assistant will be able to do that for you.
1:15:53 But until then, we depend on you to share our content.
1:15:58 And if you did enjoy the show and you learned something new, if you love to listen to YAP
1:16:03 during your workout, during your commute while you’re doing chores, if you made it a habit
1:16:08 to listen to this podcast, you love it so much, write us a review, tell the world how
1:16:10 much you love Young and Profiting Podcast.
1:16:12 It is the number one way to thank us.
1:16:16 We get reviews every single day and they always make my day.
1:16:20 If you prefer to watch your podcasts as videos, you can find us on YouTube, just look up Young
1:16:21 and Profiting.
1:16:23 You’ll find all of our videos on there.
1:16:28 You can also find me on Instagram at @yapwithhala or LinkedIn by searching my name.
1:16:29 It’s Hala Taha.
1:16:33 And before we go, I of course have to thank my incredible production team.
1:16:36 Shout out to our audio engineer, Maxie.
1:16:38 Thank you so much for all that you do.
1:16:43 Shout out to Amelia, Korday, Christina, Sean, Hisham for Khan.
1:16:45 It takes a village to put on this show.
1:16:50 So shout out to my entire team for doing incredible work.
1:16:55 Young and Profiting Podcast is a top business show and it’s because of your hard work.
1:16:57 So thank you guys so much.
1:17:00 This is your host, Hala Taha, aka The Podcast Princess, signing off.
1:17:02 [MUSIC PLAYING]
At just 11 years old, Mustafa Suleyman started buying and reselling candy for profit in his modest London neighborhood. Many years later, he co-founded one of the most groundbreaking AI companies, DeepMind, which Google later acquired for £400 million. But while at Google, Mustafa felt things were moving too slowly with LaMDA, an AI project that eventually became Gemini. Convinced that the technology was ready for real-world impact, he left to co-found Inflection AI, aiming to build technology that feels natural and human. In this episode, Mustafa shares insights on how AI is quickly changing how we work and live, the challenges of using it responsibly, and what the future might hold.
In this episode, Hala and Mustafa will discuss:
– The ethical challenges of AI development
– How AI can be misused when in the wrong hands
– AI: a super-intelligent aid at your fingertips
– Why personalized AI companions are the future
– Could AI surpass human intelligence?
– Narrow AI vs. Artificial General Intelligence (AGI)
– How Microsoft Copilot is transforming the future of work
– A level playing field for everyone
– How AI can transform entrepreneurship
– How AI will replace routine jobs and enable creativity
Mustafa Suleyman is the CEO of Microsoft AI and co-founder of DeepMind, one of the world’s leading artificial intelligence companies, now owned by Google. In 2022, he co-founded Inflection AI, which aims to create AI tools that help people interact more naturally with technology. An outspoken advocate for AI ethics, he founded the DeepMind Ethics & Society team to study the impact of AI on society. Mustafa is also the author of The Coming Wave, which explores how AI will shape the future of society and global systems. His work has earned him recognition as one of Time magazine’s 100 most influential people in AI in both 2023 and 2024.
Connect with Mustafa:
Mustafa’s LinkedIn: https://www.linkedin.com/in/mustafa-suleyman/
Mustafa’s Twitter: https://x.com/mustafasuleyman
Resources Mentioned:
Mustafa’s book, The Coming Wave: Technology, Power, and the Twenty-first Century’s Greatest Dilemma: https://www.amazon.com/Coming-Wave-Technology-Twenty-first-Centurys/dp/0593593952
Inflection AI: https://inflection.ai/
LinkedIn Secrets Masterclass, Have Job Security For Life:
Use code ‘podcast’ for 30% off at yapmedia.io/course.
Top Tools and Products of the Month: https://youngandprofiting.com/deals/
More About Young and Profiting
Download Transcripts – youngandprofiting.com
Get Sponsorship Deals – youngandprofiting.com/sponsorships
Leave a Review – ratethispodcast.com/yap
Watch Videos – youtube.com/c/YoungandProfiting
Follow Hala Taha
LinkedIn – linkedin.com/in/htaha/
Instagram – instagram.com/yapwithhala/
TikTok – tiktok.com/@yapwithhala
Twitter – twitter.com/yapwithhala
Learn more about YAP Media’s Services – yapmedia.io/