Category: Uncategorized

  • #60 Jim Dethmer: Leading Above the Line

    Jim Dethmer, founder of The Conscious Leadership Group, shares practical advice about becoming more self-aware, ditching the victim mindset, and connecting more fully with the people in our lives.

     

    Go Premium: Members get early access, ad-free episodes, hand-edited transcripts, searchable transcripts, member-only episodes, and more. Sign up at: https://fs.blog/membership/

     

    Every Sunday our newsletter shares timeless insights and ideas that you can use at work and home. Add it to your inbox: https://fs.blog/newsletter/

     

    Follow Shane on Twitter at: https://twitter.com/ShaneAParrish

     

  • a16z Podcast: The Politics of Technology

    AI transcript
    0:00:04 The content here is for informational purposes only, should not be taken as legal, business,
    0:00:10 tax, or investment advice or be used to evaluate any investment or security, and is not directed
    0:00:14 at any investors or potential investors in any a16z fund.
    0:00:18 For more details, please see a16z.com/disclosures.
    0:00:21 Hi everyone, welcome to the a16z Podcast.
    0:00:22 I’m Sonal.
    0:00:26 I’m here today with a very special guest visiting Silicon Valley, the former prime minister
    0:00:32 of the United Kingdom, Mr. Tony Blair, who now runs an institute for global change working
    0:00:35 with governments, policymakers and others all around the world.
    0:00:39 Also joining us, we have Andreessen Horowitz managing partner Scott Kupor, who has a new
    0:00:44 book just out called Secrets of Sand Hill Road: Venture Capital and How to Get It.
    0:00:49 And given that startups are drivers of economic growth and innovation, Kupor also often weighs
    0:00:54 in on various policy issues, especially those that affect the flow of capital, people and
    0:00:55 ideas around the world.
    0:00:58 And that’s the focus and theme of this episode.
    0:01:02 It’s more of a hallway-style conversation where we invite our audience to sort of eavesdrop
    0:01:04 on internal meetings and convos.
    0:01:09 We discuss the intersection of governments and technology and where policy comes in,
    0:01:13 focusing mainly on the mindsets that are required for all of this.
    0:01:17 But then we do also suggest a few specific things we can do when it comes to supporting
    0:01:21 tech change for the many, not just for the few.
    0:01:22 Welcome Tony.
    0:01:23 Thank you.
    0:01:24 Did you ask me to call you that?
    0:01:25 Thank you.
    0:01:26 Can everyone know?
    0:01:27 He said it was okay.
    0:01:28 And Kupor, welcome.
    0:01:29 Thank you.
    0:01:30 So let’s just get started.
    0:01:34 I think the context is that there’s so much discussion right now about tech in the context
    0:01:35 of inequality.
    0:01:40 One of the points of view that I have, particularly coming from a background where my family came
    0:01:44 from India, et cetera, is that it’s also very democratizing.
    0:01:49 And a lot of people can do new things in better ways because of technology.
    0:01:53 But I think the big question, the question I think we care about today is how do we bring
    0:01:58 more people into the system and make sure that tech benefits everyone?
    0:02:03 The first thing I would say from the political perspective is that technology is essentially
    0:02:04 an empowering and enabling thing.
    0:02:08 So I regard it as benign, but it’s got vast consequence.
    0:02:10 So the question is how do you deal with the consequence?
    0:02:14 How do you access the opportunities and mitigate its risks and disbenefits?
    0:02:17 So that is, I think, the right framework to look at it.
    0:02:24 But because it’s accelerating in its pace of change and because the change is so deep,
    0:02:28 and I look upon this technological revolution as like the 21st century equivalent of the
    0:02:32 19th century industrial revolution, it’s going to transform everything.
    0:02:37 So I think the fundamental challenge is that the policy makers and the change makers are
    0:02:39 not in the right dialogue with each other.
    0:02:44 And this is where misfortune will lie if you end up with bad regulation or bad policy and
    0:02:49 where the tech people kind of go into their little huddle, because I say this with great
    0:02:55 respect, but you come to this Silicon Valley and it’s like walking into another planet,
    0:02:56 frankly.
    0:02:57 Yes, that’s actually really interesting.
    0:03:00 I’m personally offended by that comment.
    0:03:03 Now I think the difficulty is that, yes, you’re right, it’s very empowering.
    0:03:07 On the other hand, it’s actually quite frightening to people because you kind of all understand
    0:03:11 it and the rest of the world doesn’t quite understand it.
    0:03:16 And insofar as they do understand it, they find it somewhat dystopian. And look,
    0:03:19 I was actually sitting with some people from my old constituency in the north of England
    0:03:24 a few months back and I said to them, I wonder what’s going to happen when we have driverless
    0:03:28 cars and their attitude was, it’s never going to happen.
    0:03:34 And the role of a politician is to be able to, in a sense, articulate to the people those
    0:03:37 changes and then fit them into a policy framework that makes sense.
    0:03:41 And that’s the worry because if the politicians don’t understand it, they’ll fear it.
    0:03:43 If they fear it, they’ll try and stop it.
    0:03:46 You articulated the vision that we’ve always had, which is we’ve always invested around
    0:03:48 this theme called software is eating the world.
    0:03:53 It’s exactly what you describe, which is technology no longer kind of sits in its own box.
    0:03:57 It really is the case that technology will permeate almost every industry over time.
    0:04:01 I think that’s where the big change is happening now: it used to be that technology was a
    0:04:02 piece of the puzzle.
    0:04:04 Now every company is a technology company.
    0:04:05 Yeah, exactly.
    0:04:07 So that is the kind of board on which people are playing.
    0:04:11 So the issue, I think, is this, is how do you get the structured dialogue between the
    0:04:13 change makers and the policy makers?
    0:04:16 What would you say is the number one thing, if you could give advice to entrepreneurs in
    0:04:21 the Valley that they should do differently to engage this kind of a framework that you’re
    0:04:22 describing?
    0:04:26 My advice would be stop looking at your own narrow interest in what you’re doing and
    0:04:31 understand you’ve got a collective interest in making the world of policy and politics
    0:04:33 understand the technology.
    0:04:34 What’s going to happen?
    0:06:41 How do you get, A, the right system of regulation and, B, how do you allow government to enable
    0:06:43 these transformative changes?
    0:04:44 Yes.
    0:04:45 Well, I actually have a question for both of you.
    0:04:48 Today is the 30th anniversary of the web, the World Wide Web.
    0:04:52 And I just saw the Google Doodle this morning and a note from Tim Berners-Lee, his original
    0:04:55 memo, “Information Management: A Proposal.”
    0:04:59 The question I have is that a lot of people would argue that the best technologies can
    0:05:04 develop when you don’t try to, A, a priori predict the consequences, because you cannot,
    0:05:09 they’re complex adaptive systems, and, B, there was an environment of quote permissionless
    0:05:13 innovation that allowed the web to thrive, because the original makers may have foreseen
    0:05:17 some apps, but the whole point is that the permissionless innovation is what allowed it to thrive.
    0:05:20 So I’d love to hear from both of you on how to balance that perspective.
    0:05:21 So I agree with that.
    0:05:25 I think though what’s different is we used to be able to compartmentalize technology.
    0:05:29 It was a piece of software that you used at work to help you be more productive.
    0:05:33 But if technology really is going to be part and parcel of everything, then I think it
    0:05:37 changes the nature of how we think about that responsibility because it is regulated industries
    0:05:42 in many cases that have been largely immune over time from technology in a way that appears
    0:05:43 to be different today.
    0:05:45 So I would say that then there are two questions that derive from that.
    0:05:48 One is how do you make regulation intelligent?
    0:05:53 How do you make it abide by the public interest or enhance the public interest, but at the
    0:06:00 same time not dampen that creative and in a way entrepreneurial drive behind the development
    0:06:01 of new ideas?
    0:06:06 And then secondly, what are the ways that government should be working with those that
    0:06:08 are going to be impacted by technology?
    0:06:11 If you’re in the car industry, it’s going to be a huge change, right?
    0:06:15 I mean, if you get these driverless cars, it’s going to change obviously jobs.
    0:06:17 It’s going to change insurance.
    0:06:22 It’s going to change the method of production, what you produce, probably change the concept
    0:06:23 of car ownership in some way.
    0:06:24 Absolutely.
    0:06:25 It might even reshape entire cities.
    0:06:26 Everything will be impacted by it.
    0:06:27 Exactly.
    0:06:32 I am fascinated by the potential of technology to allow African nations and governments to
    0:06:36 circumvent some of the legacy problems we have within our systems.
    0:06:41 And that goes for everything from basic healthcare and education through to how you help agricultural
    0:06:45 small holders develop a better yield, cooperate better together, and link up better with the
    0:06:46 market.
    0:06:49 And in fact, one thing that’s happening in Africa today is there are applications of
    0:06:52 tech that are growing up in interesting ways.
    0:06:55 So my point is, you’ve got all these different facets.
    0:07:00 And yet at the moment, the curious thing is, if you were to go to virtually any European
    0:07:06 country or if you were to come here and say, okay, name the top four issues, where would
    0:07:09 technology be in that list?
    0:07:10 Would it be at the top of the list is what you’re saying?
    0:07:12 No, I think it wouldn’t be in the list.
    0:08:16 Kupor, you spent a lot of time actually in your role as a partner in front of Congress
    0:07:20 and various entities giving testimonies about policy and curious for your take on this.
    0:07:21 Yeah.
    0:07:25 There is a concern I often hear from entrepreneurs, which is, how do we know if we go there?
    0:07:29 How does that not just bring us into the fold of regulation and therefore have negative
    0:08:32 consequences? Versus, you know, we talk about things out here:
    0:08:35 sometimes you do things and ask for forgiveness later is a better strategy.
    0:07:36 Right.
    0:08:37 Ask for forgiveness,
    0:08:38 not permission.
    0:07:39 Yeah.
    0:08:40 I completely get that.
    0:07:44 That’s why I think it’s got to be done in a big way, from the collectivity
    0:07:47 rather than individual people going, because of course you’re absolutely right: what will
    0:07:50 happen is that the entrepreneurs think, okay, if I go and say, I’ve got the following five
    0:07:53 problems that I can see in this technology I’m developing, they’re going to regulate
    0:07:54 it away.
    0:07:55 Yeah.
    0:08:01 I think the hard question will be you’re getting people and companies that wield enormous power.
    0:08:06 I mean, not just the big tech players, but the others as well.
    0:08:11 So I think one of the things that in a sense, it’s my question to you is, how do you manage
    0:08:18 to get into that dialogue with policymakers where, you know, these very powerful people
    0:08:23 recognize that in the end, you know, however powerful they are, they are not more important
    0:08:25 than the public interest.
    0:08:28 Part of what we believe our role is, is to help provide, you know, visibility.
    0:08:31 I wouldn’t, I don’t want to say education because I think politicians are very well
    0:08:35 educated and certainly well meaning, but to connect the divide between, in our case, DC
    0:08:36 and Silicon Valley.
    0:08:40 And so we will often reach out to regulators, legislators and help them understand this is
    0:08:43 what’s happening from an innovation perspective and therefore these are things that you might
    0:08:45 want to anticipate that you need to think about.
    0:08:49 So autonomous driving is a great example, right, which you mentioned: in order to
    0:08:53 make that work in the United States, we probably need forward-looking governments to say there
    0:08:58 are test zones or areas where we might have almost regulatory free zones for testing purposes,
    0:09:01 right, that have proper supervision, but to enable something that otherwise might not
    0:09:03 exist ahead of its time.
    0:09:07 Obviously, you’ve got specific micro issues, I mean, they can be big in their impact like
    0:09:11 driverless cars, but they are a specific thing; they’ve got specific issues attached to them.
    0:09:18 But where does the tech sector go if it wants to engage on, you know, the bigger macro question
    0:09:24 of how do you redesign government, by the way, as well as individual sectors, because
    0:09:27 government itself is going to have to change.
    0:09:28 That organization doesn’t exist today, at least.
    0:09:30 I’m not aware of where you would do that.
    0:09:34 And I think the other problem with it is we have to think beyond geographic and national
    0:09:39 borders on this stuff, because technology and capital are free-flowing in our society.
    0:09:44 You almost need a United Nations or some kind of, you know, type of organization to convene
    0:09:45 to have those discussions.
    0:09:46 Yeah.
    0:09:47 I would say there’s a couple of things, too.
    0:09:48 There’s a couple of factors.
    0:09:54 One, there are obviously lobbying entities like the NVCA, there’s the Internet Association,
    0:09:56 which a lot of major companies are a part of.
    0:10:02 Then there is a group of players, like there’s a group of think tanks and a middle layer,
    0:10:06 and then the government agencies themselves have been soliciting testimony.
    0:10:10 Kupor has actually done testimony on all kinds of topics, from CFIUS to crypto to
    0:10:12 various different topics.
    0:10:18 But what’s really interesting to me, especially, is there’s organizations like 18F and USDS
    0:10:23 in the US government, at least, where you have technologists doing literally rotating
    0:10:24 apprenticeships.
    0:10:28 It’s like the rotating missions, essentially, where they go for three years and they’re
    0:10:30 contributing to actually reinventing government systems.
    0:10:32 Now, this is a very important addition.
    0:10:33 Yes.
    0:10:34 I think it is, too.
    0:10:36 And what’s really amazing is that it’s got tangible impacts.
    0:10:41 So a specific example is we have a huge Veterans Administration that doesn’t deliver great healthcare.
    0:10:46 So they redesigned the VA site in order to make sure that people who have accessibility
    0:10:49 issues can use the site in a friendly way.
    0:10:51 There’s many more applications of the types of things they’re doing.
    0:10:53 We’ve actually had them on this podcast.
    0:10:54 But I think those are some avenues.
    0:10:56 But to Kupor’s point, there is no single entity.
    0:11:01 I will say that at Wired, I edited a big set of op-eds around the ITU, which is sort of
    0:11:03 like a UN for the internet.
    0:11:06 And it was during the WCIT-12 hearings, which you might recall.
    0:11:11 I think Hamadoun Touré was the head of the commission, and I edited him as well.
    0:11:16 And what’s fascinating to me is that there’s a lagging versus a leading approach to it,
    0:11:20 because you’re sort of taking the data that’s passed, not really looking forward.
    0:11:23 And that was what I saw as a big drawback when I was working on the WCIT-12 op-eds.
    0:11:28 So I’m curious for your take on how do we shift it, so you are listening to those being
    0:11:34 affected by technology, but with the point of view that spins it forward for future generations.
    0:11:38 Because if we had listened to all the farmers in the first wave of the Industrial Revolution,
    0:11:43 we may not have had many of the things we have today, but their grandkids are benefiting from those
    0:11:44 things.
    0:11:45 Yeah, no, absolutely.
    0:11:48 So look, I think there are two gaps that I see, and I just look at this from the side
    0:11:54 of, as it were, the ordinary politician, because I think there are initiatives that are happening
    0:11:58 inside government where people or departments will get it, and therefore they’ll embrace
    0:12:02 it and bring in smart people to help them and so on.
    0:12:04 But I think there are two sort of lacunae.
    0:12:10 One is your average politician does not understand a lot about this, and that is not sort of
    0:12:12 a disrespect to your average politician.
    0:12:18 It’s that it’s new, it’s complicated, it takes you time to get your head around it.
    0:12:22 My eldest son is in technology, and I am always trying to get him to explain blockchain to
    0:12:23 me.
    0:12:24 We’re big on crypto.
    0:12:25 I know.
    0:12:31 I remember you sent me the other day saying, “This is the idiot’s guide to cryptocurrency,
    0:12:33 and I still couldn’t understand it.”
    0:12:35 I’m going to send you our crypto canon.
    0:12:36 We took a stab at compiling the resources.
    0:12:37 Right.
    0:12:38 But you probably shouldn’t test me on it.
    0:12:39 But that is one lacuna.
    0:12:44 Those people have to understand this is like the 19th century Industrial Revolution.
    0:12:47 So you’ve got to get your ordinary politicians to understand it.
    0:12:51 And then there’s another lacuna, which is, I think, in getting the dialogue at the top
    0:12:56 level between particularly the Americans and the Europeans, because I also think it would
    0:13:00 be immensely helpful if we had a more transatlantic approach.
    0:13:01 I think there’s a third piece.
    0:13:03 There’s an incentives problem.
    0:13:07 I would imagine if you did a survey of most politicians, they would say, “My fundamental
    0:13:11 role is how do I improve long-term economic growth and job sustainability for my constituents?”
    0:13:15 I mean, if people kind of cut through a lot of the politics, that’s really why they think
    0:13:16 they’re there.
    0:13:19 Look, they want to make a better life and make a better opportunity for their constituents.
    0:13:23 The problem we have, though, is that their short-term incentive is to get reelected,
    0:13:25 which I understand is a good thing from a political perspective.
    0:13:29 It’s very hard for them to take that long-term view, because the shorter-term opportunity
    0:13:34 is to say, “Look, I really need to do no harm to my constituents,” and technology
    0:13:39 might be, in the short term, displacing and unsettling to job growth and other stuff,
    0:13:41 particularly for different segments of the population.
    0:13:44 It’s very hard, I would imagine, as a politician to square those two things, which is how do
    0:13:49 I help my constituents understand, you know, to Sonal’s point that, yes, over a period
    0:13:54 of time, it was a good idea to have industrialized farming as opposed to pure manual agrarian
    0:13:55 farming.
    0:13:58 But that’s an incredibly unsettling thing, particularly in the U.S. here, if every two
    0:14:01 years you have to get reelected or you go find a new job.
    0:14:07 By the way, this happened in the 19th century and you had whole new politics created around
    0:14:08 it.
    0:14:10 And I think there are two things that are important here.
    0:14:16 First of all, I think the technology will, in some way, provide solutions to what is
    0:14:23 a constant dilemma for an ordinary politician, which is we need to do more for our people,
    0:14:27 but we can’t just keep spending more and taxing more.
    0:14:31 If the technology can help unlock part of that, that is something they’re prepared to
    0:14:32 go for.
    0:14:37 And secondly, with most politicians, if they’re able to see this within a longer-term perspective,
    0:14:42 what you say to them, “Look, we’ll help you and guide you through this process of change,
    0:14:44 but in the end, it’s a beneficial change.”
    0:14:47 And what I found when I was in government is some of the most difficult reforms we put
    0:14:54 through, for example, around education reform, healthcare reform, we were able, in some ways,
    0:15:01 with at least some people, to say this short-term difficulty is going to be worth it.
    0:15:03 How did you pull that off, though?
    0:15:05 Was it the education, the explanation?
    0:15:07 Was it consensus building?
    0:15:12 I mean, let me take a very specific example, which, of course, is under attack now, but
    0:15:18 we introduced tuition fees in the UK, but my point was very, very simple, that universities
    0:15:22 are going to be major engines of economic growth in the future, in particular because
    0:15:26 of the link between university and technology and the development of technology.
    0:15:31 And therefore, we cannot afford for UK universities not to be able to get the best talent, and
    0:15:34 they’re going to have to therefore have an extended funding base.
    0:15:35 They can’t get it all from government.
    0:15:39 And my point is, if you get it all from government in the end, some governments will start to
    0:15:43 slice it away, and you’re always hand-to-mouth as universities.
    0:15:49 And I reckon when we did that, it was very difficult, in fact, it was extremely difficult,
    0:15:55 but in the end, you were able to say to people, “Look, if we want to save our position as
    0:16:00 a country that, along with the US, probably has the most high-quality universities in the
    0:16:02 top 50 in the world, then we’ve got to be prepared to do that.”
    0:16:07 Now, some people, by the way, rejected it, and today it’s a big political issue again,
    0:16:12 but you can get to at least some form of alignment between long-term and short-term.
    0:16:15 It’s a fundamental rethinking of what the role of government is, quite frankly, right?
    0:16:19 Which is, again, if you take the premise that the overall objective for government is to
    0:16:23 create economic conditions that hopefully generate long-term economic growth and sustainability
    0:16:25 for individuals and companies, then you’re right.
    0:16:29 Maybe the ancillary role of government is, how do we deal with short-term issues that
    0:16:32 have market dislocations for people?
    0:16:35 Maybe that’s a more proper way to describe what the role of government is, in many cases.
    0:16:38 I think the other thing would be, I think there’s another question for politics which
    0:16:43 would be very challenging, because what would be weird is if the whole of the world is undergoing
    0:16:46 this revolution and politics is just kind of staying fixed.
    0:16:49 The type of people who go into politics, what happens often is people leave university,
    0:16:53 they’re going to become a researcher for an MP, and then they become an MP, and then they
    0:16:55 become a minister, but they have no experience of the outside world, right?
    0:16:59 So that’s one, and it becomes a constraint over time, and then the types of people who
    0:17:00 work in government.
    0:17:04 So you were saying something about the people who’ve been brought into, say, the Veterans
    0:17:06 Administration here.
    0:17:12 So how do you actually open up public service and then get a greater interoperability between
    0:17:15 public service and the private sector?
    0:17:20 Because all of the sort of pressure certainly coming from the media has been not to allow
    0:17:25 that to happen, and not to allow politicians to have anything other than what they’re usually
    0:17:26 just focused on…
    0:17:29 Okay, let’s say we all agree, which I think we do, that there needs to be a connection
    0:17:32 between all the entities working together, no question.
    0:17:36 More engagement, more explanation, more understanding, thinking of consequence.
    0:17:37 I think those are all table stakes.
    0:17:41 The question now is, how do you then think about unintended consequences?
    0:17:46 Because the story, to me, is not that bad things have bad consequences.
    0:17:51 It’s that often the worst consequences come from very well-intended things.
    0:17:53 And quite frankly, the perfect example that comes to mind is GDPR.
    0:17:57 Yeah, to make it concrete, there’s been a, over the last several months, and I’m sure
    0:18:00 probably more so in Europe as well, there’s been a number of articles talking about when
    0:18:05 you look at kind of the broad impact of GDPR, essentially, it’s endured largely to the benefit
    0:18:08 of the very large incumbents, which was probably not what’s intended to do.
    0:18:09 Because they’ve got the resource to better handle.
    0:18:10 That’s exactly right.
    0:18:14 And the analogy we have here in the States was the Dodd-Frank legislation that came out
    0:18:18 of the global financial crisis, where financial institutions had to comply with a whole new
    0:18:20 set of regulations.
    0:18:24 What it really did here in the U.S. was, it really entrenched those incumbents very well,
    0:18:27 and it made it very hard for startup financial institutions to grow.
    0:18:31 It was very hard for a new institution to get a banking license for many years, in part
    0:18:33 because of the regulatory cost of doing so.
    0:18:35 And so, how do you balance that?
    0:18:38 And maybe the answer is, look, it’s an education problem, but well-meaning politicians
    0:18:43 certainly expect and intend that regulation is the appropriate way to deal with these things.
    0:18:47 It does, in some cases, interfere with the overall goal of entrepreneurship and startup
    0:18:48 formation.
    0:18:53 That’s why I think that the attitude of the technology sector to engagement with government
    0:18:54 is so important.
    0:18:58 Because if you’re engaging with government saying, look, we understand there’s a massive
    0:19:01 set of issues here, and we’re really going to sit down and work with you as to how we
    0:19:07 get the right answer, then government’s in a position where they regard you as a partner.
    0:19:11 But I think for this moment in time, a bit like, actually, the aftermath of the financial
    0:19:17 crisis, government kind of regards it as: you’re looking after yourselves, but we’ve got to look
    0:19:19 after the public.
    0:19:21 And that’s where it leads to poor regulation.
    0:19:28 I mean, poor regulation is nearly always the consequence of a failure on the regulating
    0:19:31 side to really understand what’s going on.
    0:19:35 And on the founder’s side, what I’m hearing is to really communicate the benefits of the
    0:19:36 technology upfront.
    0:19:40 And not to be so defensive that you’re just thinking all the time, how can we ward these people
    0:19:41 off?
    0:19:46 But here’s the thing, you can sometimes, if you have wealth, which a lot of these big
    0:19:51 tech players do, and power, and you also have access, and you can go and see whoever you
    0:19:57 want to see, it can sometimes mask your essential underlying vulnerability.
    0:19:58 Interesting.
    0:19:59 Right.
    0:20:05 And your vulnerability is that there comes a point when suddenly the mood flips, and it doesn’t
    0:20:10 matter how much wealth and power and access you have, you’re the target.
    0:20:12 At that point, everything changes.
    0:20:16 So if you want to avoid that, I think it’s got to be a dialogue that’s structured and
    0:20:22 it requires not just things happening between the tech sector and government, but for people
    0:20:28 like my own institute, to use our sort of convening power, the political side, to say
    0:20:30 to the politicians, look, let’s get our heads around this.
    0:20:32 Here’s my essential challenge.
    0:20:37 How do you take this technological revolution and as a politician, weave it into a narrative
    0:20:39 of optimism about the future?
    0:20:40 Yes.
    0:20:41 I want that too.
    0:20:42 Right.
    0:20:43 Yeah.
    0:20:44 So what’s driving the populism is pessimism.
    0:20:46 If people are pessimistic, then they look for someone to blame.
    0:20:52 If people are optimistic, they look for something to aspire to, and that’s the essential difference.
    0:20:54 It’s really interesting also, Sonal and I have been having this conversation, and we’ve
    0:20:58 been having this conversation in the U.S. about ESG, right, which obviously, you know, certainly…
    0:20:59 Environmental, social, governance.
    0:21:00 Right.
    0:21:02 Which Europe is way ahead of the U.S. there, and Larry Fink, who’s the head of BlackRock
    0:21:05 here, has written this letter, you know, appealing to CEOs, and it really goes to the same issue
    0:21:09 you’re talking about, which is fundamentally, what is the role of the corporation and how
    0:21:12 do corporations think about obviously enhancing value for their shareholders, but also to your
    0:21:17 point recognizing that they impact constituents in many other ways, and I think that’s kind
    0:21:21 of the dialogue we ought to be having with politicians, which is, look, we can create
    0:21:25 a world where it’s compatible to have, you know, maximizing shareholder opportunity,
    0:21:28 but also recognizing and being a part of the broader community discussion about the impact
    0:21:29 on society.
    0:21:35 The other thing is to recognize that when we create these things, we have some obligation
    0:21:36 to share.
    0:21:40 It comes out of fundamental macroeconomics, right, which is we can improve growth for
    0:21:44 a country by either population growth and/or productivity growth, right?
    0:21:47 Those are the two levers in theory that we can impact, and if we can frame the discussion
    0:21:50 around technology, that’s a lot of where the U.S. has done well, right?
    0:21:54 We’ve generally, obviously times are changing, but we’ve generally been very open to immigration
    0:21:58 and thought about population growth as a way to help improve the lot for people generally,
    0:22:01 and we’ve also been very open to productivity growth, right, in the form of technology and
    0:22:05 automation, and if we can frame it that way, but also to your point, recognize that there
    0:22:09 are going to be disintermediations along the way, and part of our responsibility is to help
    0:22:14 from a training and education perspective, and even potentially the role of government
    0:22:19 in subsidizing the transition from less automated to more automated society.
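    [Aside, not from the episode: the two levers Kupor names fall out of the standard decomposition of output into workers times output per worker, under which growth rates approximately add. In LaTeX:]

        Y = L \cdot \frac{Y}{L}
        \qquad\Longrightarrow\qquad
        \frac{\Delta Y}{Y} \;\approx\; \frac{\Delta L}{L} \;+\; \frac{\Delta (Y/L)}{Y/L}

    [That is, GDP growth is roughly population (labor) growth plus productivity growth.]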
    0:22:21 What happens to education in all of this?
    0:22:23 I don’t think we have a singular point of view on it.
    0:22:27 We have talked about education a lot on this podcast and shared a diversity of views, but
    0:22:32 I think a couple of the high level things are that universities are huge drivers, of course,
    0:22:36 as you mentioned, of innovation, and in every study of regional innovation, every innovation
    0:22:43 cluster is successful because of the collaboration between universities, government, local, entrepreneurial
    0:22:44 communities.
    0:22:47 The other key point, however, is it’s a combination of top-down and bottom-up.
    0:22:52 Attempts at top-down, industrially planned smart cities, or planned versions of Silicon
    0:22:53 Valley, never work.
    0:22:56 And bottom-up ones alone don’t necessarily work either.
    0:22:57 You need a combination of the two.
    0:23:00 That’s the number one finding, but the second thing, and this is a big topic we talk about
    0:23:05 on this podcast, is the importance of apprenticeship and a new type of education that really thinks
    0:23:07 about skills-based education.
    0:23:12 We have this elitist attitude that education has to be a certain way when, in fact, in
    0:23:16 this day and age, especially with increased automation and the need for jobs, we might
    0:23:19 want to be really thinking about very specific skills-based education.
    0:23:24 It’s actually fascinating because, in fact, my eldest son’s got a company that works on apprenticeships
    0:23:28 and technology, so that’s exactly what he does.
    0:23:32 I think it’s really, really interesting because of the idea that you don’t necessarily have
    0:23:33 to go to university.
    0:23:38 Well, there are alternative universities coming about too, like we’re investors in
    0:23:39 Udacity.
    0:23:40 There’s also Lambda School.
    0:23:44 There’s all these interesting types of containers where people can get what they call nano-degrees
    0:23:46 or microskills or specific skills.
    0:23:50 There’s so much that’s actually in play, because the point I want to raise here, this is kind
    0:23:55 of an underlying theme to me, is that technology, as you pointed out, you can take an optimistic
    0:23:56 view.
    0:23:59 It also gives you the means to address many of the problems that we are complaining about
    0:24:04 because when I think of some of the trade-offs between waiting for a government to update
    0:24:12 policy, what I love is that a mass of users on a platform can essentially vote by
    0:24:18 leaving that platform, and immediately that platform is going to act the next day in a
    0:24:20 way that a lawmaker cannot overnight.
    0:24:22 Yes, from a political perspective.
    0:24:24 You want this thing at least to have some sort of rational-
    0:24:25 Of course.
    0:24:26 It shouldn’t be mobbed.
    0:24:31 But I think the other thing is, a lot of what drove, for example, Brexit
    0:24:35 in the UK, apart from the immigration issue, was this idea of communities of people left
    0:24:36 behind.
    0:24:41 So what is it that technology would do to go into those communities and help people gain
    0:24:45 better education, get connectivity to the world, because in the end, this is what it’s
    0:24:46 all about.
    0:24:47 If you’re not connected, you’re not really–
    0:24:48 If you’re left behind.
    0:24:49 Right.
    0:24:55 So I think one big question is, how does the technology sector help us as policymakers
    0:25:00 reach those people for whom the conversation we’ve just been having may be sort of scratching
    0:25:02 their heads and wondering what these guys are on about.
    0:25:03 That’s a fantastic question.
    0:25:05 And actually, it’s interesting because we’re investors in NationBuilder, which is one of
    0:25:10 the companies that mobilize a lot of the communities that actually organize, pro and con,
    0:25:11 around these things.
    0:25:14 So a quick thing, I do want to make sure we actually give answers because we’re asking
    0:25:15 a lot of questions.
    0:25:19 So can you both give a little bit more on what concretely we need to do?
    0:25:23 So from the point of view of my institute, what we’re doing is we’re creating a tech
    0:25:25 and public policy center.
    0:25:29 And the idea is to bring a certain number of people who really understand the tech side
    0:25:32 and a certain number of people who come from the public policy side, put them together
    0:25:33 in a structured way.
    0:25:35 I will kind of curate that.
    0:25:43 And out of it should come what I call serviceable policy and a narrative about the future,
    0:25:46 which makes sense of this technological revolution.
    0:25:50 And then to link up with politicians, not just in the UK and Europe, but actually over
    0:25:55 here and create a sense that this technological revolution should be at the center of the
    0:25:56 political debate.
    0:25:57 How do we handle it?
    0:26:00 How do we, as I say, mitigate its risks and access its opportunities?
    0:26:03 So that’s one very specific thing.
    0:26:07 And then I think the other thing, frankly, is just to be out there, myself and a number
    0:26:13 of other people who at least have access to the airwaves, to say, guys, we’ve got to switch
    0:26:14 the conversation.
    0:26:18 You’ve got to put this technology question at the heart of the political debate.
    0:26:22 Now the solution, some people may go to the left, some people may go to the right.
    0:26:24 Some people may be in between.
    0:26:25 But make it the conversation.
    0:26:29 Put it in the top four of those priorities for every country, every organization.
    0:26:30 I think that’s right.
    0:26:32 Fundamentally, we’re talking around the issues.
    0:26:36 It’s either immigration or it’s income inequality or other things that drive the debate.
    0:26:40 But the fundamental question is exactly that, which is how do we move forward with broader
    0:26:41 economic growth initiatives?
    0:26:46 So sitting here in Silicon Valley, any individual company is probably better off, quite frankly,
    0:26:49 breaking the glass first and then asking for forgiveness later.
    0:26:53 And so it’s, I think, the idea of having kind of solving that collective action problem
    0:26:56 through a convening organization makes a lot of sense.
    0:26:59 But you come to the issues of very traditional income inequality.
    0:27:03 Now there is a perfectly good question as to whether you raise the minimum wage, and
    0:27:04 if so, by how much?
    0:27:07 And it was my government that introduced the minimum wage in the UK, so I’m very familiar
    0:27:10 with all those arguments.
    0:27:17 But in the end, there is a whole other dimension to that individual, which is about the world
    0:27:20 that’s changing and their place in it, and whether they’re going to have the skills and
    0:27:21 the aptitude to be able to.
    0:27:25 So you’re just saying completely at every level reframe that technology is at the center
    0:27:26 of that.
    0:27:27 Right.
    0:27:31 So it’s not that you displace traditional questions of taxation and inequality, but the
    0:27:38 truth of the matter is it’s going to be probably in the long term more significant for that
    0:27:43 individual and for the society if this technological revolution is handled properly.
    0:27:47 So if you had a debate in the UK at the moment about our healthcare system, our national
    0:27:53 health service, it would be, should we spend 10 billion pounds a year more on it or 5 billion?
    0:27:59 But how you change the whole of the way we deliver care for people because of technology
    0:28:00 is going to have a much bigger impact.
    0:28:01 I agree.
    0:28:05 I guess the only thing I would add to this, because I think about this a lot, interestingly,
    0:28:10 is that we treat technology like this word, this homogenous, nebulous entity.
    0:28:16 And the reality is that every single instance so depends on the specific technology.
    0:28:19 So my call to action, I guess, would be to think about it very specifically.
    0:28:25 The way we think about AI, that’s such a broad phrase and it’s a very scary phrase that
    0:28:30 suggests everything from generalized intelligence to very specific automation that gets your
    0:28:33 bank account updated automatically.
    0:28:34 So I think there’s two things to this.
    0:28:38 One, that we need to be incredibly specific about what technology we’re talking about,
    0:28:39 in what context.
    0:28:43 And then, B, we also need to dial in the right degree of what we’re talking about at what point,
    0:28:44 because it really does make a difference.
    0:28:45 I completely agree with that.
    0:28:50 I mean, I think the only thing I would say is right now we’re actually far away from
    0:28:53 even getting to the specifics.
    0:28:54 Yeah, you’ve got a good point.
    0:28:55 That’s fair.
    0:28:59 You know, I was just saying to people in politics, when we were campaigning,
    0:29:03 we campaigned on the slogan, right, New Labour,
    0:29:04 New Britain.
    0:29:05 Right.
    0:29:06 And they say, no, but it’s much more complicated now.
    0:29:09 I say, okay, guys, it is complicated, but sometimes you need to go
    0:29:10 straight to the point.
    0:29:11 I hear you.
    0:29:15 You’re saying that when you go back to your old constituency
    0:29:19 in the north of England, they don’t care about the specifics, they just need to have their
    0:29:20 fears addressed.
    0:29:22 The first thing that you need to persuade them of, you’ve got to say to them, guys, technology
    0:29:24 is going to change the world and we’ve got to prepare for it.
    0:29:25 They’re not there yet.
    0:29:29 When you get them there, then obviously in all sorts of different ways, but this is where
    0:29:35 I think that the gulf that there is between the technology sector and the policymakers and
    0:29:36 therefore the people is so big.
    0:29:40 How do we deal with the fundamental challenges that we have as we talked about earlier from
    0:29:44 an incentive perspective and short tenures in, you know, office and people’s ability
    0:29:45 to be in office?
    0:29:48 Is that a conversation you think that the country and the nations are prepared to have?
    0:29:50 I think so, but it’s a very good question.
    0:29:55 I would also say that in all of the change that’s going to happen, I mean, this is a
    0:29:58 whole topic for another podcast, probably with the different people.
    0:30:08 But how you exchange information and the validation of that information is an essential part of
    0:30:12 having a democratic debate and that is a big problem in today’s world.
    0:30:19 So I think it is possible to have that conversation with people, but all political conversations
    0:30:25 today are extremely difficult because they happen in such a fevered environment with
    0:30:31 so much polarization and the interaction between conventional and social media makes a rational
    0:30:33 debate occasionally extremely difficult.
    0:30:35 With that qualification, I’d answer yes.
    0:30:39 I think the better way to approach that problem is to say, how do we make the U.S. and/or Europe
    0:30:45 or other places attractive to entrepreneurship and encourage people to think about the regulatory
    0:30:49 framework and the economic framework as, you know, want to be participants in these markets
    0:30:53 as opposed to the anti, you know, kind of policies we have, which is let’s make it harder for
    0:30:56 free flow of capital and try to stave off those opportunities.
    0:31:00 It used to be that 90% of venture capital and entrepreneurship happened in the U.S., literally,
    0:31:05 as recently as 20 years ago, and if you look at those numbers today, it’s about 50%.
    0:31:09 And so the amount of kind of capital that’s kind of been distributed globally and therefore
    0:31:12 the amount of opportunity set distributed globally is interesting.
    0:31:15 We have to think about this beyond kind of regional borders.
    0:31:18 We have talent and people that are free-flowing across geographies.
    0:31:21 And so we have to think about this from a broader, you know, global initiative.
    0:31:25 Well, you guys, I just want to say thank you for joining the a16z Podcast.
    0:31:26 Thank you.

    with Tony Blair (@InstituteGC), Scott Kupor (@skupor), and Sonal Chokshi (@smc90)

    If the current pace of tech change is the 21st-century equivalent of the 19th-century Industrial Revolution — with its tremendous economic growth and lifestyle change — it means that even though it’s fundamentally empowering and enabling, there are also lots of fears and misconceptions. That’s why, argues former U.K. prime minister Tony Blair (who now has an eponymous Institute for Global Change), we need to make sure that the changemakers — i.e., technologists, entrepreneurs, and, quite frankly, any company that wields power — are in a structured dialogue with politicians. After all, the politician’s task, observes Blair, is “to be able to articulate to the people those changes and fit them into a policy framework that makes sense”.

    The concern is that if politicians don’t understand new technologies, then “they’ll fear it; and if they fear it, they’ll try and stop it” — and that’s how we end up with pessimism and bad policy. Yet bad regulations often come from even the very best of intentions: take, for example, the case of Dodd-Frank in the U.S., or more recently, GDPR in Europe — which, ironically (but not surprisingly), served to entrench incumbent and large-company interests over those of small- and medium-sized businesses and startups. And would we ever have had the World Wide Web if we hadn’t had an environment of so-called “permissionless innovation”, where government didn’t decide up front how to regulate the internet? Could companies instead be more inclusive of stakeholders, not just shareholders, with better ESG (environmental, social, governance)? Finally, how do we ensure a spirit of optimism, focusing on leading vs. lagging indicators about the future, while still being sensitive to short-term displacements, as with farmers during the Industrial Revolution?

    This hallway-style episode of the a16z Podcast features Blair in conversation with Sonal Chokshi and a16z managing partner Scott Kupor — who has a new book just out, Secrets of Sand Hill Road: Venture Capital and How to Get It, and also often engages with government legislators on behalf of startups. They delve into mindsets for engaging policymakers; touch briefly on topics such as autonomous cars, crypto, and education; and consider the question of how government itself, and politicians too, will need to change. One thing’s for sure: the discussion today is global, beyond both sides of the Atlantic, given the flow of capital, people, and ideas across borders. So how do we make sure globalization works for the many… and not just for the few?

    image credit: Benedict Macon-Cooney


    The views expressed here are those of the individual personnel quoted and are not the views of a16z or its affiliates. This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors and may not under any circumstances be relied upon when making a decision to invest in any a16z funds. PLEASE SEE MORE HERE: https://a16z.com/disclosures/

  • a16z Podcast: AI and Your Doctor, Today and Tomorrow

    AI transcript
    0:00:03 Hi, and welcome to the A16Z podcast. I’m Hannah.
    0:00:08 This episode is all about how artificial intelligence is coming to the doctor’s office, how it will
    0:00:13 impact the nature of doctor-patient interactions, diagnosis, prevention, prediction, and everything
    0:00:14 in between.
    0:00:20 The conversation between A16Z’s general partner Vijay Pande and Dr. Eric Topol, cardiologist
    0:00:25 and chair of innovative medicine at Scripps Research, is based on Topol’s book, “Deep
    0:00:29 Medicine,” and touches on everything from how AI’s deep phenotyping can shift our
    0:00:34 thinking from population health to understanding the medical health essence of you, how the
    0:00:38 industry might respond, the challenges in integrating and introducing the technology
    0:00:43 into today’s system, what the doctor’s visit of the future might really look like, and
    0:00:47 ultimately how AI can make healthcare more human.
    0:00:50 Before we talk about technology and we talk about all the things that are changing the
    0:00:54 world or making huge impacts, it’s interesting to just think like, “What should a doctor
    0:00:55 be doing?”
    0:00:56 And how do you see that?
    0:01:02 That’s really what I was pondering and I really did this deep look into AI.
    0:01:07 I actually didn’t expect it to be this back to the future story, but in many ways, I think
    0:01:13 it turns out that as we go forward, particularly in a longer-term view, the ability to outsource
    0:01:19 so many things with help from AI and machines, I think is going to get us back.
    0:01:20 It could.
    0:01:21 It could.
    0:01:24 That’s a big if, to where we were back in the ’70s and before.
    0:01:28 What was better then was that doctors were spending much more time with us, right?
    0:01:29 Exactly.
    0:01:34 That gift of time, the human side, which is the center of medicine that’s been lost.
    0:01:40 The big business of healthcare and all of its components like electronic records and
    0:01:48 relative value units and all this stuff basically has sucked out any sense of intimacy and time.
    0:01:51 And it’s also accompanied by lots of errors.
    0:01:58 But of course, it’s not a gimme, because administrators want more efficiency, more productivity.
    0:02:02 I think you put this on Twitter where it’s like some kid drew like a drawing of going
    0:02:07 to the doctor and the picture was the doctor with their back turned working on a computer.
    0:02:12 And that is what happens too much, but yet we’re talking about technology coming in.
    0:02:15 So how does this all work out that more technology means less computer?
    0:02:23 I think that is kind of the fundamental problem, doctors not even making eye contact,
    0:02:30 and for a child to draw that picture, how unnerving that was, her trip to the pediatrician. Natural
    0:02:35 language processing can actually liberate us from keyboards.
    0:02:41 And so it’s already being done in some clinics and even in the UK in emergency rooms.
    0:02:49 And so if we keep that up and build on that, we can eliminate that whole distraction of doctors
    0:02:52 and nurses and clinicians being data clerks, which is ridiculous.
    0:02:58 So the fact that voice recognition is just moving so fast in terms of accuracy and speed
    0:02:59 is really encouraging.
    0:03:03 Alexa is a very basic version of voice recognition, but you’re talking about something much more
    0:03:04 sophisticated.
    0:03:07 That’s something where they’re actually doing NLP; they’re doing transcriptions so doctors
    0:03:09 don’t have to take notes.
    0:03:13 If you were more sophisticated, you could put an ontology onto this such that you’re
    0:03:17 not just getting like a transcript of what’s going on, but that you have very machine-learning-
    0:03:19 friendly, organized data.
    0:03:20 Exactly.
    0:03:26 So the notes that are synthesized from the conversation are far better than the notes
    0:03:32 that you would get in Epic or Cerner, where 80% are cut and pasted and they’re error-laden.
    0:03:39 So I mean, Google AI just published in JAMA their experience, and I think it’s really
    0:03:45 going much faster because the accuracy of the transcription and the synthesized note is
    0:03:50 far better than what we have today and it exceeds professional medical transcriptionists in
    0:03:51 terms of accuracy.
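    [Aside, not from the episode: a minimal toy sketch in Python of what “machine-learning-friendly, organized data” from a visit transcript might look like. The ontology categories and keyword rules are invented for illustration; a real clinical NLP system would use trained models, not keyword matching.]

        import re

        # Hypothetical toy ontology: category -> trigger keywords (invented).
        ONTOLOGY = {
            "order_lab": ["lab test", "blood work"],
            "order_imaging": ["scan", "x-ray", "mri"],
        }

        def structure_transcript(transcript):
            """Split a visit transcript into sentences; tag each with ontology categories."""
            records = []
            for sentence in re.split(r"(?<=[.!?])\s+", transcript.strip()):
                tags = [cat for cat, keywords in ONTOLOGY.items()
                        if any(k in sentence.lower() for k in keywords)]
                records.append({"text": sentence, "tags": tags})
            return records

        # Paraphrasing the episode's example order through the toy tagger:
        visit = ("We're going to have you have lab tests for such and such. "
                 "Then we're going to get this scan.")
        for record in structure_transcript(visit):
            print(record)

    [The first sentence comes out tagged order_lab and the second order_imaging: free conversation reduced to structured records a model can consume.]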
    0:03:52 Yeah.
    0:03:55 And so I’m imagining like what the doctor visits like then.
    0:03:59 So we’ve got maybe NLP so the doctor doesn’t have to be transcribing and not interacting
    0:04:00 with Epic.
    0:04:02 Who knows what’s in the back end, but it doesn’t even matter anymore.
    0:04:03 Right.
    0:04:10 So we’ve got billing and all of that; for like a PCP, all that’s a huge headache.
    0:04:13 That’s all part of that conversation because you say, “Well, Mr. Jones, we’re going to
    0:04:18 have you have lab tests for such and such and then we’re going to get this scan and
    0:04:20 it’s all done through the conversation.”
    0:04:22 We could bring in other technologies, right?
    0:04:28 We’ve thought about just how imaging or other type of diagnosis comes in and we’ve seen
    0:04:33 all these cool things about how machine learning can improve this, but then also it’s fine
    0:04:38 because at the same point, and you bring this up in the book that over-diagnosing also can
    0:04:39 be very difficult.
    0:04:40 Yeah.
    0:04:44 Like it was very stunning where you talked about how the incidence of thyroid cancer
    0:04:47 is going up, but the mortality’s been flat.
    0:04:48 Exactly.
    0:04:53 And so I guess the challenge will be is how can we bring in these technologies in so doctors
    0:04:56 can do the things they should be doing and not the things they shouldn’t be doing.
    0:04:57 Right.
    0:05:03 Well, I think there is getting your arms around a person’s data, this whole deep phenotyping.
    0:05:10 So no human could actually integrate all this data, not only the whole electronic record,
    0:05:18 which all too often is incomplete, but also pulling together sensor data, genomic data,
    0:05:24 gut microbiome, all the things that you’d want to be able to come up with not only better
    0:05:29 diagnoses, but also the better strategy for prevention or treatment.
    0:05:34 So I think what’s going to make life easier for both doctors and for the patients is having
    0:05:39 that data fully processed and distilled.
    0:05:44 In the book, I tell the story about my knee replacement and how it was a total fiasco.
    0:05:49 Part of that was because my orthopedist who did the surgery wasn’t in touch with my congenital
    0:05:50 condition.
    0:05:57 And so that hopefully is going to be something we can transcend in the future.
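    [Aside, not from the episode: one way to picture the “deep phenotyping” integration described here is as a merge of partial views of a patient into a single record, with the gaps made explicit. All field names and values below are invented.]

        # Hypothetical partial views of one patient (invented data).
        ehr = {"patient_id": "p1", "diagnoses": ["hypertension"], "meds": ["lisinopril"]}
        sensors = {"patient_id": "p1", "resting_hr": 62, "sleep_hours": 6.4}
        genomics = {"patient_id": "p1", "apoe_variant": "e3/e3"}

        # Fields a 'complete' deep phenotype would cover (again, invented).
        EXPECTED_FIELDS = ["diagnoses", "meds", "resting_hr", "sleep_hours",
                           "apoe_variant", "gut_microbiome"]

        def deep_phenotype(*views):
            """Merge partial data sources into one record; list expected fields still missing."""
            record = {}
            for view in views:
                record.update(view)
            record["missing"] = [f for f in EXPECTED_FIELDS if f not in record]
            return record

        print(deep_phenotype(ehr, sensors, genomics))

    [Here “missing” would report gut_microbiome: like the incomplete record in the knee story, the merged view makes visible what the clinician still does not know.]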
    0:06:02 And this is one thing that computers can do very well: logistics and coordination.
    0:06:05 There’s tons of cases where, like the thyroid cancer we were just talking about, maybe
    0:06:09 you have to bring in an endocrinologist in addition to an oncologist.
    0:06:13 And it’s shocking that often there’s no discussion there, there’s no communication.
    0:06:19 But yet the challenge in my mind is how does the computer magically know things that we can’t
    0:06:20 do right now?
    0:06:27 Well, that’s I think where the complementarity, the synergy between machines and people is
    0:06:35 so ideal, because we just have early satiety with data, whereas deep learning has an insatiable
    0:06:36 appetite.
    0:06:38 And so that contrasts.
    0:06:45 But we have, as doctors and humans, we have just great contextual abilities, the judgment,
    0:06:53 the wisdom, the experience, and just the features that can basically build on that machine
    0:06:58 processing because we don’t ever want to trust an algorithm for a serious matter.
    0:07:04 But if it tees it up and we have oversight and we fit it into that person’s story, that
    0:07:06 I think is the best of both worlds.
    0:07:08 Well, here I’m really curious because what is the ground truth?
    0:07:13 Because generally we don’t trust an individual person as like, as knowing everything either,
    0:07:14 right?
    0:07:19 But the ground truth would be like a second opinion or a third opinion or a fourth opinion
    0:07:22 or even like, you know, a board to look at something.
    0:07:24 And that would be what I think most people would view as the ground truth.
    0:07:29 The ground truth, when it’s applied to training an algorithm, of course, is knowing that it
    0:07:32 is the real deal, that it is really true.
    0:07:40 And I think a great example of that is in radiology because radiologists have a false
    0:07:44 negative rate of 32 percent, false negative.
    0:07:49 And that’s the basis for most of the litigation in radiology, which is over the course of
    0:07:54 a career, a third of radiologists get sued mostly because they miss something.
    0:08:00 So what you have are algorithms with ground truths that are trained on, you know, hundreds
    0:08:05 of thousands of scans so that whether it’s a chest x-ray, a CT scan or MRI, whatever
    0:08:08 it is, that it’s not going to miss stuff.
    0:08:12 But then, of course, you’ve got that over read by the trained radiologists.
0:08:20 So, you know, I think that’s an example of how we can use AI to really rev up accuracy.
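To make that workflow concrete, here is a minimal sketch of a classifier trained against a ground-truth label that auto-reports only its confident calls and routes everything else to a radiologist over-read. The features, thresholds, and synthetic labels are illustrative assumptions, not anything from a real imaging pipeline.

```python
# Hedged sketch: ground-truth-trained classifier with a human over-read gate.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 64))   # stand-in for features extracted from scans
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)  # "ground truth"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Auto-report only confident calls; everything else goes to the radiologist.
proba = model.predict_proba(X_test)[:, 1]
confident = (proba < 0.1) | (proba > 0.9)
print(f"auto-reported: {confident.mean():.0%}, "
      f"routed to radiologist over-read: {(~confident).mean():.0%}")
print("accuracy on the confident subset:",
      (model.predict(X_test)[confident] == y_test[confident]).mean())
```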
0:08:23 Somebody is going to need to interpret the algorithms or sort of be on top, and in
0:08:29 a sense, the doctor is freed from the stuff the doctor shouldn’t be doing, like the BS
0:08:32 accounting or the typing or all these things.
0:08:34 And that’s not the best use of doctors’ time.
0:08:39 And it’s funny because it seems like the best use of doctors’ time is in understanding what
    0:08:43 these tests would mean, whether it be an AI test or a cholesterol or whatever, and in
    0:08:45 how to communicate with the patient.
    0:08:48 And so that seems to be a theme running through the book.
    0:08:51 So for the first half, in terms of the understanding, what do you think that’s going to look like?
    0:08:55 I mean, one of the things I’ve been constantly wondering about is whether there’ll be a new
    0:08:56 medical specialty.
    0:09:00 Like, you know, because you don’t have radiology without like CT or X-rays, right?
0:09:03 So presumably you didn’t have radiology a hundred years ago.
0:09:05 Do we have AIology or something like that?
0:09:12 Saurabh Jha, who is a radiologist at Penn, he and I penned a JAMA editorial about the
0:09:16 information specialists: radiologists and pathologists.
    0:09:21 Their foundation is reviewing patterns and information.
0:09:26 But what’s interesting is this is an opportunity for them to connect with patients, because
0:09:27 right now they don’t.
0:09:32 You know, right now a radiologist never sees a patient; radiologists live in the basement.
0:09:35 The pathologists look at the slides with that group of pathologists.
    0:09:40 But they actually want to interact with patients and they have this unique insight because
    0:09:43 they’re like the honest brokers, they’re not, they don’t want to do a surgery.
    0:09:46 They want to give you their expertise.
0:09:49 And so I think that’s where we’re going to see a pretty substantial change.
0:09:54 And as you touched on, this new specialty will look different than the way it is today.
    0:09:58 In that case, it almost seems like everybody is better off.
0:10:01 The doctor is better off because the pathologist is not just looking at slides, but actually
0:10:02 is dealing with patients.
0:10:08 And presumably the patient is better off because these false negatives get found.
0:10:12 The pathologists, you know, they have remarkable discordance when they look at slides, and to
0:10:18 be able to have a slide basically looked at as if hundreds of thousands of pathologists had reviewed it,
0:10:24 getting back to your second, third and nth opinion, and to get that as input for them to help
0:10:26 and consult with a patient,
0:10:28 I think is really a bonus.
0:10:31 You’re talking about radiology and pathology here, but I mean, we could sort of think about this
0:10:33 as an issue for diagnosis in general.
0:10:36 And you know, there’s one thing you point out in the book, I think really beautifully, you
    0:10:40 know, you said once trained doctors are pretty much wedged into their level of diagnostic
    0:10:42 performance throughout their career.
0:10:46 That’s kind of an amazing thing, because doctors, I guess, go through CME and so on.
    0:10:49 But like, you only go through med school once and that’s an intense process.
    0:10:52 You learn a lot, but you can’t go through med school all the time.
    0:10:53 No, it’s so true.
    0:10:58 And that gets me to, you know, Danny Kahneman’s book about thinking fast and slow and the
0:11:05 system one that is the reflexive thinking that happens automatically, versus what we
0:11:11 want, which is reflective thinking, system two, which takes time.
0:11:17 And it turns out that if a doctor doesn’t think of the diagnosis of a patient in the
0:11:21 first five minutes, there’s over a 70% error rate.
0:11:25 And that’s actually about the average amount of time spent with patients.
    0:11:31 So we have the problem as you’ve alluded to of kind of a plateauing early on in a career,
    0:11:35 but we also suffer because of lack of time with the system one thinking.
    0:11:40 The thought is that the machine learning is reflecting what system two would look like
    0:11:44 because it’s trained from doctors sort of doing system two.
    0:11:45 Exactly.
0:11:53 That brings in the integration of what would be the ground truths of thousands for that
    0:11:55 particular dataset.
    0:11:57 So I think it has a potential.
0:12:03 And of course, a lot of this stuff needs validation, but there are a lot of promising studies to
    0:12:05 date that suggests that’s going to be very possible.
0:12:11 When I think about what ML or AI could do, there are like two axes.
    0:12:17 There’s one like just scale, like the fact that you can scale up servers on Amazon trivially
    0:12:19 much more than you could scale up human beings.
    0:12:22 We could scale up a thousand servers right now or 10,000 servers right now.
    0:12:25 I don’t think we could call 10,000 doctors and get them here.
0:12:29 The other axis you could sort of talk about is like sort
0:12:31 of intelligence or capability.
    0:12:36 And you’re talking about both in a sense that you can scale up not just the fact that you
    0:12:42 could have like a resident or med school students sort of doing things and having a lot of them,
0:12:48 but in addition to that, you have in a sense a doctor through AI that’s in diagnostics
    0:12:50 that’s better than any single doctor.
    0:12:51 Right.
    0:12:55 I think that’s really what is going to be one of the big early changes in this new AI
    0:13:00 medical era is that diagnosis is going to get so much better.
    0:13:05 Right now we have over 12 million serious errors a year in the United States.
    0:13:09 And they’re not just costly, but they hurt people.
    0:13:14 So this is a real opportunity to upgrade that and that’s a much more of a significant problem
    0:13:16 than most people realize.
0:13:20 And so far we’ve been talking about things that feel like the sci-fi fantasy version of
    0:13:21 stuff, right?
    0:13:26 I mean like because we’ve got like this doctor of sorts through diagnosis that can do what
    0:13:30 no single doctor could do, presumably at scale it’s doing this at lower cost.
    0:13:35 It’s allowing human beings to do the things they should be doing.
    0:13:37 Is there any dystopian turn here or how does this?
0:13:39 Oh, there’s no shortage of those.
    0:13:42 And so how could this go wrong and what can we do to prevent it?
    0:13:47 Well, I mean I think one of the things we’ve harped on is that you’ve got to have human
    0:13:48 oversight.
0:13:54 We can’t trust an algorithm absolutely for any serious matter, because if we do that
    0:13:59 and it has a glitch or it’s been hacked or it has some kind of adversarial input, it
    0:14:01 could hurt people at scale.
    0:14:03 So that’s one of the things that we got to keep an eye on.
0:14:11 For example, if an algorithm gets approved by the FDA, oftentimes these days it’s
0:14:14 an in silico, retrospective sort of thing.
    0:14:20 And if we just trust that without seeing how it performs in a particular venue, a particular
    0:14:24 cohort of people, these are things that we just shouldn’t accept blindly.
0:14:30 So there are lots of deep liabilities, and it runs, of course, from privacy to security to
0:14:31 ethics.
    0:14:37 There are many aspects that are not ideal about this, but when you think about the need
    0:14:41 and how much it could provide to help medicine, I think those are the trade-offs that we have
    0:14:43 to really consider.
0:14:47 What should we be doing now, and what could people be doing now, to sort of anticipate this?
    0:14:48 Or do you think it’s like too early?
    0:14:51 I mean, because people have these algorithms now.
    0:14:54 What do we see in one year versus five years versus 10 years?
    0:14:57 Well, it is rolling out in other parts of the world.
    0:15:03 I just finished this review with the NHS and that was fascinating because they are really
    0:15:04 going after this.
0:15:10 They are the leading force in the world in genomics, and now they want to be in AI.
0:15:15 So they already have emergency rooms that are liberated from keyboards,
0:15:18 and they are going after this.
0:15:23 And this is of course in the middle of Brexit, so that’s kind of amazing.
0:15:27 But China is really implementing this, you could say maybe too fast, out of
0:15:32 desperation or need, but one of the advantages that we don’t recognize with China is
    0:15:37 not just that they have scale in terms of people, but they have all the data for each
    0:15:38 person.
    0:15:40 And we have no data for each person.
    0:15:46 Basically, our data is just spread around all these different doctors and health systems.
    0:15:47 Nobody has all their data.
0:15:52 And that is a big problem because, without inputs that are complete,
    0:15:53 For like you personally.
    0:15:54 Yeah.
    0:15:55 Yeah.
    0:15:56 Then what are you going to get out of that?
0:16:00 So we are at a handicapped position in this country.
    0:16:05 And the other thing of course is we have no strategy as a nation, whereas China, UK and
    0:16:10 many other countries, they are developing or have developed planning and strategy and
    0:16:13 put in resources here as a nation.
    0:16:14 We have zero resources.
    0:16:19 In fact, we have proposed cuts to the same, you know, granting agencies that would potentially
    0:16:20 help.
0:16:23 And so what should one do at that scale?
0:16:27 Like, you know, there are various things people propose; you know, should there be
0:16:31 a new national institute of health, you know, in this area?
0:16:35 I mean, when I think about the government playing a role, I want
0:16:40 them to help build the marketplace and set the rules, but we have to be careful
0:16:42 that we don’t put in too much regulation as well.
0:16:46 I mean, when you say we don’t have a strategy,
0:16:47 what’s missing?
    0:16:48 What should we be doing?
0:16:52 We have no national planning or strategy.
    0:17:00 How is AI not only for healthcare, but in general, how is it going to be cultivated and made
    0:17:02 transformative?
    0:17:07 The experience I had in the UK was really interesting because there they not only have the will,
    0:17:12 but they have a whole wing of the NHS for education and training.
0:17:13 Just think about it.
0:17:20 They talked about professions within medicine that are going to have more of their daily
0:17:21 function change.
0:17:25 So, we’re not well prepared; you know, who should take the lead?
    0:17:29 One of the problems we have that you’re touching on is our professional organizations haven’t
    0:17:31 really been so forward thinking.
    0:17:36 They mainly are centered on maintaining reimbursement for their constituents.
    0:17:43 The, you know, entities like NIH and NSF and others could certainly be part of the solution.
    0:17:47 What you want to do here, I think, is to really accelerate this.
0:17:52 We’re in the middle of an economic crisis in healthcare, and the US is the worst
0:17:53 outlier.
0:18:00 I mean, we’re spending over $11,000 per person and we have the worst outcomes: life expectancy
0:18:06 going down three years in a row, childhood mortality, infant mortality, maternal mortality.
0:18:09 The worst. People don’t realize that.
    0:18:14 Then you have the UK and so many other countries that are at the $4,000 per year level and
    0:18:16 they have outcomes that are far superior.
    0:18:21 So, if we use this, we could actually reduce inequities.
    0:18:27 We could make for a far better business model, paradoxically, but we’re not grabbing the
    0:18:28 opportunity.
    0:18:31 Maybe there’s another solution we could think about, which you also point to in the book,
0:18:36 which is: what can we do to drive this through consumer action?
0:18:39 For instance, a lot of our healthcare is sick care, right?
    0:18:40 What happens when we get sick?
    0:18:41 That’s almost all of it.
    0:18:44 What about, what can we do to stay healthy?
    0:18:47 First thing I think of is diet and lifestyle, right?
    0:18:50 That could go a long way in so many diseases, so many things that we deal with.
    0:18:52 So, actually, you touch on diet.
0:18:56 Before we even talk about, like, diagnosing whether you have cancer, should we be diagnosing
    0:18:57 what you should be having for lunch?
    0:19:01 I couldn’t agree more that that should be a direction.
    0:19:09 We have had this so naive notion that everyone should have the same diet, and we never got
    0:19:16 that right as a country, but now we know without any question that people have an individualized
    0:19:19 and highly heterogeneous response.
0:19:21 And that’s not just glucose spikes.
    0:19:26 If you and I ate the exact same food, the exact same amount, the exact same time, our
    0:19:30 glucose response would be very different, but also triglyceride response would be different,
    0:19:32 and they don’t track together.
    0:19:37 So, what we’re learning is if you get all this multimodal data, not just your gut microbiome
    0:19:43 and sensor data and your sleep and your activity, your stress level, and what exactly you eat
    0:19:47 and drink, we can figure out what would be promoting your health.
    0:19:50 We’re not there yet, but we’re seeing some pretty rapid progress.
0:19:55 What’s intriguing to me is that there are cases now, especially, let’s say,
0:20:00 just glucose, where you can take technology developed for type one or type two diabetics.
    0:20:05 Now, I’m not diabetic, but I actually had the sort of, I was about to say joy, but at
    0:20:09 least the intellectual intrigue of having a CGM on me for two weeks.
    0:20:10 Yeah.
    0:20:11 So, I got to play all these experiments.
    0:20:12 Right.
    0:20:13 Right?
0:20:14 So, I tried white rice versus brown rice.
    0:20:15 Yeah.
    0:20:16 How’s ice cream?
    0:20:20 Wine versus scotch, all the important questions one has to figure out.
    0:20:21 Exactly.
0:20:26 And it was actually a surprise to me how, for instance, I did not spike
0:20:27 on ice cream
0:20:28 but spiked on brown rice.
    0:20:29 Yeah.
    0:20:32 I don’t think I prepared to go on the ice cream diet just yet.
    0:20:33 Yeah.
    0:20:34 And I don’t think you would prescribe that either, right?
    0:20:37 But I think the idea is that it’s just different for everybody, right?
    0:20:39 So, maybe you spike on ice cream, I don’t.
0:20:45 And that, you know, what I think has been kind of so annoying about nutrition is that we
    0:20:48 hear all these conflicting things, but perhaps part of the reason why we’re hearing these
    0:20:50 conflicting things is that it is so individual.
    0:20:51 Exactly.
    0:20:56 And that it’s so complicated and such a fundamental data science problem that it probably takes
    0:20:57 something like machine learning to figure it out.
    0:20:59 Well, I think that’s central.
    0:21:02 If we didn’t have machine learning, we wouldn’t have known this.
0:21:06 And only, you know, thanks to the group at the Weizmann Institute in Israel, who cracked
0:21:07 the case on this.
0:21:09 So, Eran Segal’s work?
0:21:10 Yeah.
0:21:11 Eran Segal.
    0:21:15 And now, it’s been replicated by many others and it’s being extended.
    0:21:17 What would be promoting your health?
    0:21:22 And right now, it’s, you know, these proxy metrics like your glucose or your lipids in
    0:21:30 the blood, but eventually we’ll see how outcomes and prevention can be fostered by your diet.
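As a rough illustration of the kind of modeling behind that work, here is a toy sketch in the spirit of the Weizmann studies: a gradient-boosted regressor over per-person features such as meal composition, sleep, and a microbiome measure. Everything here is synthetic and simplified; the published work used far richer features and careful held-out validation.

```python
# Toy sketch of personalized glucose-response prediction (synthetic data).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 2000
meal_carbs   = rng.uniform(0, 100, n)   # grams of carbohydrate in a meal
sleep_hours  = rng.uniform(4, 9, n)     # prior night's sleep
microbiome_1 = rng.normal(size=n)       # one stand-in microbiome abundance feature

# Synthetic "truth": carbs drive the spike, but modulated by sleep and the
# microbiome, which is what makes the response individual rather than universal.
spike = (0.8 * meal_carbs - 5 * sleep_hours
         + 15 * microbiome_1 * (meal_carbs / 50)
         + rng.normal(scale=10, size=n))

X = np.column_stack([meal_carbs, sleep_hours, microbiome_1])
model = GradientBoostingRegressor(random_state=0).fit(X, spike)

# Same meal, same sleep, two different "people": different predicted responses.
print(model.predict([[60, 7, -1.0], [60, 7, 1.0]]))
```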
0:21:31 It’s really kind of mind-blowing
0:21:34 how difficult a data science problem nutrition seems to be.
    0:21:35 Yeah.
0:21:39 The problem, Vijay, is the number of levels and the sea of data.
    0:21:44 I mean, we’re talking about terabytes of data to crack the case for each individual.
0:21:48 So it’s not even just your gut microbiome, the species of bacteria and their density,
    0:21:54 but now we know it’s the sequence of those bacteria that are part of the story.
0:21:59 Then you have, of course, these continuous glucose readings every five minutes for a couple of
    0:22:00 weeks.
    0:22:01 That’s a lot of data.
    0:22:09 Besides that, you’ve got, you know, all your physical activity, your sensors for stress,
    0:22:15 you know, your sleep data, and then even your genomics.
    0:22:20 So when you add all this together, this is a real challenge.
    0:22:23 No human being could assimilate all this data.
    0:22:28 But what’s interesting is not only at the individual level, but then with thousands of people.
    0:22:33 So take everything we just talked about, multiply by thousand or hundreds of thousands.
    0:22:35 That’s how we learn here.
0:22:41 And so what I think is the biggest underappreciated thing about AI is the things that
0:22:43 we’re going to learn that we didn’t know.
0:22:49 Like, for example, another great example is when you give a picture of a retina to an international
0:22:55 retina expert, and you say, is this from a man or a woman, the chance of them getting
0:22:56 it right is 50/50.
    0:23:02 But you can train an algorithm to be over 97, 98% accurate.
    0:23:06 And there’s so many examples like that, like you wouldn’t miss polyps in a colonoscopy,
    0:23:08 which is a big issue.
0:23:13 Or you would be able to see your potassium level in your blood through your smartwatch,
0:23:14 without any blood draw.
    0:23:20 And then the imagination just runs wild, as far as what you could do when you train things.
    0:23:27 And so training your diet with this torrent of data, not just from you, but from a population
    0:23:29 is, I think, a realistic direction.
    0:23:33 And what I think is interesting about this is that it’s something where, A, we don’t
0:23:39 need the AMA or NIH or anything else to get involved in terms of diet.
    0:23:43 And B, actually, people want to take care of these problems, because I think most people
    0:23:44 are motivated.
    0:23:45 We just don’t know what to do.
    0:23:46 Right.
    0:23:52 And so many aspects of it, like now chronobiology is really this hot topic.
    0:23:54 That’s about your circadian rhythm.
    0:23:57 And should you eat only for eight hours during the day?
    0:23:58 Well, certain people, yes.
    0:24:03 But the whole idea that there’s this thing for everyone, we got to get over that.
    0:24:08 That’s what deep phenotyping is all about, to learn about the medical health essence
    0:24:09 of you.
    0:24:13 And we haven’t had the tools until now to do that.
    0:24:14 OK.
    0:24:17 So there’s a ton of data, but a lot of it seems kind of subjective.
    0:24:18 Right?
    0:24:20 I mean, did I sleep well or not?
    0:24:24 How do you sort of overcome the fact that not everything is quantitative, like my cholesterol
    0:24:25 level?
    0:24:30 Well, it turns out that was kind of old medicine where we just talked about your symptoms.
    0:24:34 But new medicine is with all sorts of objective metrics.
    0:24:39 So a great example of this is state of mind or mood.
    0:24:43 And that’s going to be transformative for mental health, because now everything from
0:24:52 how you type on your smartphone to your voice, which is so rich in terms of tone and intonation,
0:24:57 to your breathing pattern, to facial recognition.
0:25:02 I mean, there are all these ways to say, you know, Vijay, you’re really depressed.
    0:25:03 Yeah.
    0:25:04 You know that you’re depressed.
0:25:12 So the point is that you can have objective metrics of one’s mental health. As a cardiologist
0:25:13 for all these years,
0:25:18 I’d have these patients come and tell me, I feel my heart’s fluttering.
    0:25:20 And I would put in the note, the heart’s fluttering.
    0:25:22 That was so unhelpful.
    0:25:26 Now I can say, well, you know, you should be able to record this on your phone or if
    0:25:33 you have a smartwatch and when your heart flutters, just send me the PDF of that.
    0:25:38 And we have the diagnosis that is real world, no longer subjective.
0:25:39 A whole different look, really.
0:25:45 And by the way, if the patient who has the fluttering records their cardiogram,
0:25:47 they don’t have to wait for me.
    0:25:53 They already have an automated read from AI that’s more accurate than a doctor.
    0:25:55 There is something very anecdotal about the doctor visit.
0:25:57 It’s not right there in the moment.
    0:25:58 Yeah.
    0:25:59 Right.
    0:26:00 It’s a one off.
    0:26:01 Yeah.
    0:26:02 It’s a one off.
0:26:05 And so it’s a funny thing, because people wonder about, let’s say, the knock on a wearable will
0:26:08 be that it’s not like an eight-point EKG or something like that.
    0:26:10 But on the other hand, it’s there with you all the time.
    0:26:11 Yeah.
    0:26:12 No, exactly.
0:26:16 And then there’s this contrived aspect of going to see the doctor, where a lot of
    0:26:19 people find that very stressful.
    0:26:23 And when we talk about white coat hypertension, we don’t even know what normal blood pressure
    0:26:28 is because we need to check that out in thousands, hundreds of thousands of people in their real
    0:26:30 world to find out what’s normal.
    0:26:36 We’ve already had this chaos of the American Heart Association saying that they changed
0:26:42 the blood pressure guidelines on the basis of no data, speaking of a lack of an objective
0:26:43 metric.
    0:26:44 Yeah.
    0:26:47 Well, so one other area that I thought was really intriguing and just to me, this was
    0:26:52 almost paradoxical, the concept of AI being useful for empathy.
    0:26:54 Because I would have thought like, if we’re thinking about the things that a computer
    0:26:58 is good at, like multiplying numbers, that’s going to be something like they’re going to
    0:26:59 beat humans at any day.
0:27:04 I would have thought that empathy would be the one, like the last bastion of what we’re
0:27:06 good at versus what the computer is good at.
    0:27:08 But how does AI get to empathy?
0:27:12 Because as we started the conversation about how empathy is a key part of what a doctor
0:27:15 does, we’re like, what can AI do there?
    0:27:19 Well, we are missing that in a big way today.
    0:27:21 And how do we get it back?
    0:27:27 Well, I think how we get it back is we take this deep phenotyping, we do deep learning
0:27:35 about the person, and that’s all outsourced, with oversight from a doctor or clinician.
    0:27:42 Now, when you have this remarkable improvement in productivity in workflow and efficiency
    0:27:46 and accuracy, all of a sudden, you have the gift of time.
0:27:54 If we just lie down, as doctors have over the decades, for administrators to go ahead and
    0:28:03 just drive revenue and basically have no consideration for patients or doctors, we’re not going to
    0:28:05 see any growth of empathy.
    0:28:09 We’re not going to see the restoration of care in healthcare.
0:28:15 But if we stand up and we say that time, all that benefit of the AI part, the machine
0:28:19 support, should be given back. And by the way, that’s also at the patient level.
    0:28:23 So the patients now, with their algorithmic support, they’re decompressing the doctor
    0:28:24 load too.
    0:28:25 Yeah, they’re doing some of it.
    0:28:29 A lot of simple things, ear infections, skin rashes and all that sort of stuff that’s not
    0:28:35 life threatening or serious, but that’s bypassing a doctor potentially almost completely.
    0:28:43 So between this flywheel of algorithmic performance enhancement, if we stand up for patients, then
    0:28:45 we have all this time to give back.
    0:28:51 Once we have time to give back, then we tap into why did humans go into the medical profession
    0:28:52 in the first place.
    0:28:57 And the reason was because they want to care for their fellow human being, but they lost
    0:28:58 their way.
    0:29:03 And now we have the peak burnout and depression and suicide in the history of the medical
    0:29:04 profession.
    0:29:08 And by the way, not just in the US, in many parts of the world, and how are we going to
    0:29:10 get that back?
0:29:15 Because it turns out, if you have a burned-out doctor, you have a doubling of errors, and it’s
0:29:20 a vicious cycle: you have errors, and they get more burned out, more depressed.
    0:29:21 So we have to break that up.
0:29:29 And I think if we can get people to where there’s time together, and that real reason, the
0:29:34 mission of healthcare, is brought back, we can do this.
    0:29:38 It’s going to take a lot of activism, it’s not going to be easy, and it’s going to take
    0:29:39 a while.
    0:29:42 But if we don’t start planning for this now, it’s not going to happen.
0:29:45 How do you think that changes how you become a doctor?
    0:29:50 I mean, getting into med school and all the training is really difficult.
    0:29:53 What does the future of medical education look like?
0:29:59 Right now, a pre-med degree is a lot of biology and chemistry, with not too much effort in psychology
    0:30:06 or empathy or in statistics or in machine learning.
    0:30:07 What does that look like in the future?
    0:30:10 I think we’re missing the mark there.
    0:30:18 We continue to cultivate Brainiacs, who have the highest MCAT scores and grade point averages,
0:30:22 and who are oftentimes relatively low on emotional intelligence.
    0:30:24 So we have tilted things.
    0:30:25 We want to go the other way.
    0:30:30 We want to emphasize who are the people who have the highest interpersonal skills, communicative
0:30:35 abilities, and who really are the naturally empathetic people.
    0:30:40 Because a lot of that Brainiac work is going to be machine generated.
0:30:45 And so we should all start to lean in that direction.
    0:30:46 Yeah.
0:30:49 Now, it’s intriguing, because I think there’s a chicken-and-egg problem here, because I think
0:30:51 first this has to be put in place.
0:30:55 Often with these big changes, there will be resistance.
    0:30:56 Who’s going to be fighting this?
    0:31:00 The resistance we have to anticipate is going to be profound.
    0:31:08 One of the problems is that the medical profession, it may not be ossified, but it’s very difficult
    0:31:09 to change.
0:31:16 The only changes that have occurred rapidly, like the adoption of robotics in surgery,
    0:31:19 were because it enhanced revenue.
0:31:21 None of these things are going to enhance revenue.
    0:31:24 They’re actually going to potentially be a hit.
    0:31:27 We have all these interests that this is going to challenge.
    0:31:33 But for example, we could get remote monitoring of everyone in their home instead of being
0:31:38 in a hospital room, unless they need an intensive care unit.
    0:31:42 Now, do you think hospitals are going to allow that to happen?
    0:31:47 Because they could be gutted, and then they won’t know what to do with all their facilities.
    0:31:50 So the American Hospital Association is not going to like this.
0:31:51 So let’s not be delusional here.
    0:31:53 They’re not going to like it revenue-wise.
    0:31:58 Would they say that would put patients in danger, because obviously at home you don’t
    0:31:59 have what a hospital has?
    0:32:00 Well, you know what the interesting thing is?
    0:32:03 I don’t know if you can get in more danger than going into our hospitals.
    0:32:05 One in four people are harmed.
0:32:07 Yeah, you mean sepsis and infection.
0:32:10 The main thing, nosocomial infections from the hospital.
0:32:15 But also medication errors and other things. Versus the comfort of your own home:
0:32:19 you can actually sleep, you’d be with your loved ones, the convenience.
    0:32:22 But most importantly, just think of the difference in expense.
0:32:29 You could buy years of a broadband data plan for one night in the hospital, which is $5,000
    0:32:30 on average.
    0:32:31 It’s amazing.
    0:32:37 We have the tools to do that now, but you’re not seeing it being seriously undertaken because
    0:32:38 of the conflicts.
    0:32:43 So if you think about how all this has to actually happen, we talked about what’s possible.
    0:32:48 But if you get to nuts and bolts, it’s interesting to think who’s going to do it.
0:32:53 Because if you take just a pure data scientist who doesn’t understand the medicine,
0:32:58 I don’t know if that would be enough; but also I don’t know if you could take a doctor
    0:33:01 that doesn’t understand data science.
0:33:06 And so is it going to be teams of commingled groups that get this together?
    0:33:09 Because there will be iterations between the data science and the biology and the clinical
    0:33:14 aspects that have to come one after the other to be able to make these advances.
    0:33:19 We need machines and people to get the best of both worlds.
0:33:28 So in the book, there’s that example of how we cracked the potassium case between Mayo Clinic cardiologists
0:33:30 and AliveCor data scientists.
    0:33:36 And what was amazing about that experience to review with them was that the cardiologists
0:33:40 thought you should only look at one part of the cardiogram, which is historically known as the
    0:33:45 so-called QT interval, because it was known to have something to do with potassium.
0:33:51 But when that flunked and the algorithm fell flat, the data scientists said, well, why
    0:33:52 are you so biased?
    0:33:54 Why don’t we just look at the entire cardiogram?
    0:33:58 And by the way, Mayo, you only gave us a few million cardiograms.
    0:34:01 And why don’t you give us all the cardiograms?
    0:34:02 So then they nailed it.
0:34:08 So the whole idea is that the biases that we have are profound.
    0:34:15 But when you start de-biasing both the data scientist and the doctors, the medical people,
    0:34:18 then you start to get a really great result.
0:34:23 One of the scariest stories I saw was that this algorithm was getting cancer versus no cancer
0:34:29 right with crazy-high accuracy, like an AUC of like 1.0, like never making a mistake.
    0:34:33 And it turned out that there was some subtle difference between like a high Tesla magnet
    0:34:39 and a low Tesla magnet, and that the patients who were very sick to start off with were
    0:34:42 always getting one type of scan, and that a human being couldn’t tell the difference,
0:34:47 but that the machine was picking up some signal, not of whether it was cancer or no cancer,
    0:34:52 but whether they were getting like the fancy measurement or the less complicated one.
0:34:56 Or another great example: there’s a classic example where they were, I think, predicting
0:35:01 tumors, and they had rulers for the size of the tumor on all the tumor images.
    0:35:03 And so really ML was a great ruler detector.
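That "ruler detector" failure mode is easy to reproduce in miniature. In this hedged sketch, a spurious marker that tracks the label (the ruler) lets a model score near-perfectly; shuffling that one feature at evaluation time exposes how little of the performance was real biology. The data and feature names are invented for illustration.

```python
# Hedged sketch: a confound masquerading as diagnostic skill.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 4000
tumor = rng.integers(0, 2, n)
ruler = tumor.copy()                              # the confound tracks the label...
flip = rng.random(n) < 0.02
ruler[flip] = 1 - ruler[flip]                     # ...with only 2% disagreement
biology = tumor + rng.normal(scale=2.0, size=n)   # the weak "real" signal

X = np.column_stack([ruler, biology])
model = LogisticRegression().fit(X, tumor)
print("accuracy with confound intact:  ", model.score(X, tumor))

# Stress test: break the confound at evaluation time and watch performance drop.
X_broken = np.column_stack([rng.permutation(ruler), biology])
print("accuracy with confound shuffled:", model.score(X_broken, tumor))
```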
    0:35:04 Yeah.
0:35:10 The whole idea that as pathologists we can’t see the driver mutation in a slide, but you
    0:35:13 could actually train the algorithms.
    0:35:17 So when the pathologist is looking at it, it’s already giving you what is the most likely
    0:35:19 driver mutation.
    0:35:20 It’s incredible.
    0:35:27 And that does get me to touch on the deep science side of this, which we aren’t recognizing
    0:35:31 is way ahead of the medical side, the ability to upend the microscope.
0:35:35 You don’t have to use fluorescence or H&E. You just train.
    0:35:37 So you forget staining.
0:35:43 The idea that it used to be hard to find rare cells: just train the algorithms to find
    0:35:44 the rare cells.
    0:35:50 I mean, we’re seeing some things in science, no less in drug discovery, in processing cancer
0:35:57 and sequencing data, and certainly in neuroscience; it’s a real quiet revolution that’s much
0:36:01 further ahead than on the medical side, because there are no regulatory hurdles.
0:36:05 And you make a good point, because I think it’s tempting to just try to do what the human
0:36:08 can do but better, or what the human can do now just as well.
    0:36:12 But now you’re talking about doing things that no human being could do.
    0:36:13 Yeah.
0:36:17 Imaging plus genomics, where the genomics readout, let’s say, or whatever the blood assay
0:36:18 is, is the gold standard.
    0:36:21 I don’t want to predict what the pathologist would say.
    0:36:25 I want to predict the biopsy, I want to predict the blood or whatever the true gold standard
    0:36:26 is behind it.
    0:36:27 Right.
    0:36:30 And if you’re training on the best labels, you can do things that no human being could
    0:36:31 do.
0:36:36 Well, you know, this may be the most important point: we have to start having imagination
    0:36:44 because we don’t even have any idea of the limitless things that we could teach machines.
    0:36:46 Because I’m getting stunned almost on a weekly basis.
    0:36:49 I said, I never would have thought of that.
    0:36:52 And so just fast forward, here we are in 2019.
    0:36:53 What’s it going to be like?
0:36:56 You know, of all the things, like when the Mayo Clinic told me they could look
0:37:01 at 12-lead cardiograms for millions of people and be able to say this person’s going to get
    0:37:06 atrial fibrillation in their life with X percent probability, I said, really?
    0:37:07 And they’ve done it.
    0:37:10 And so I never would have expected that.
    0:37:11 Yeah.
0:37:16 That’s a really fun point, because you could think of it two ways: either human
0:37:21 beings aren’t being imaginative enough, or, what does imagination mean for an algorithm?
    0:37:22 Right.
0:37:28 Well, until we get into heavy unsupervised learning, we’re a bit limited by the annotation
0:37:31 and the ground truth, going back to that.
    0:37:35 You can only imagine things when you have those for supervised learning.
    0:37:40 But you know, as we go forward, we’ll have more of those data sets to work with and we’ll
0:37:47 be better at going forward with federated data sets and unsupervised learning.
    0:37:50 So the opportunities going forward are pretty enthralling.
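The federated idea mentioned here can be sketched in a few lines: each site fits a model on its own data and only the fitted weights travel, which is roughly the FedAvg recipe in miniature. This is a toy with linear least-squares and synthetic data, not a statement about how any real clinical consortium works.

```python
# Hedged sketch: size-weighted averaging of locally fitted models (FedAvg-style).
import numpy as np

rng = np.random.default_rng(3)
true_w = np.array([2.0, -1.0, 0.5])   # the shared signal all sites are sampling

def local_fit(n_patients):
    """Fit a linear model on one site's private data; only weights leave the site."""
    X = rng.normal(size=(n_patients, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n_patients)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, n_patients

site_fits = [local_fit(n) for n in (200, 500, 800)]    # three "hospitals"
total = sum(n for _, n in site_fits)
global_w = sum(w * (n / total) for w, n in site_fits)  # size-weighted averaging
print("federated estimate:", np.round(global_w, 2))    # ~ [2.0, -1.0, 0.5]
```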
    0:37:51 Yeah.
0:37:54 Well, the unsupervised learning is interesting. You know,
0:37:57 for those who aren’t familiar with the term, it’s kind of like trying to find the clusters,
0:38:01 to sort of not have the labels, but to see the lay of the land.
    0:38:04 And that’s interesting because no human being can sort of, especially in high-dimensional
    0:38:07 space, like visualize that and see that.
    0:38:08 And so that’s one thing.
    0:38:13 The second thing is that if you just throw all of the data in and maybe have the algorithm
    0:38:18 make sure that it’s not overfitting, that it’s not trying to find an overly complicated story
    0:38:23 almost like, you know, these conspiracy theories are like human beings overfitting for the
    0:38:26 moon landing being a hoax or something like that when there’s a simpler explanation for
    0:38:27 things.
    0:38:30 If you keep it to a simple explanation, the computer can try everything.
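A minimal sketch of that unsupervised "lay of the land" idea: cluster unlabeled, high-dimensional measurements, then project to two dimensions so a person can finally look at the structure. The synthetic "patients" and subtype separation are assumptions made for the example.

```python
# Hedged sketch: clustering without labels, then projecting so a human can see it.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# Two latent patient subtypes hiding in 50-dimensional measurements.
group_a = rng.normal(loc=0.0, size=(300, 50))
group_b = rng.normal(loc=1.5, size=(300, 50))
X = np.vstack([group_a, group_b])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("recovered cluster sizes:", np.bincount(labels))   # ~ [300, 300]

# Projecting to 2-D is how a person finally gets to "see" the lay of the land.
coords = PCA(n_components=2).fit_transform(X)
print("2-D coordinates ready to plot:", coords.shape)
```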
    0:38:31 Yeah.
    0:38:32 Yeah.
    0:38:33 So like you talked about, it could look at the whole cardiogram.
0:38:39 We could look at things that we don’t look at, because either we’re expert enough to
0:38:42 know that couldn’t possibly be right, even if it is.
    0:38:43 Or we just don’t have the time.
0:38:48 It reminds me that sometimes these algorithms are almost like children, in that kids just don’t know,
0:38:49 so they’ll try things.
    0:38:50 Yeah.
    0:38:53 And that’s where imagination and creativity often comes from.
    0:38:55 I couldn’t agree with you more.
    0:38:58 So we’ve been spending a lot of time talking about diagnosis, but prediction is another
    0:39:00 thing that is really important.
    0:39:03 I’d call that a real soft spot in AI.
0:39:09 And I told the story in the book of my father-in-law, who was kind of my adopted father,
0:39:13 about how he was on death’s door.
    0:39:19 He was about to come to our house to die and he was resurrected.
    0:39:23 But any algorithm would have said he was a goner.
    0:39:29 And so the idea that at the individual level, you could predict accurately whether it’s
    0:39:33 end of life or when you’re going to die or in the hospital.
    0:39:37 This is how long you’re going to stay or you’re going to be readmitted, all these things.
    0:39:38 We’re not so good at that.
    0:39:45 We can have a general sense from a population level, but so far prediction hasn’t really
    0:39:51 panned out nearly as well as classification, diagnosis, triage, that kind of stuff.
0:39:57 And I still think that that’s one of the shakier parts, because then you’re going to tell a
0:40:01 person about a prediction, and we’re not very good at that.
    0:40:06 When we talk to people with cancer and we tell them their prognosis, it’s all over
    0:40:08 the place in reality.
    0:40:13 And so the question is, are algorithms really going to do better or are they just going
    0:40:17 to give us a little more precision, maybe not much?
    0:40:21 Is there enough information to ever predict anything like that?
    0:40:25 Well, that’s a part of the problem too is that the studies that have been done to date,
    0:40:30 things like predicting Alzheimer’s, predicting all sorts of outcomes you can imagine, they’re
    0:40:36 not with complete data, they’re just taking what you can get, like what’s in one electronic
    0:40:41 health record, one system, rather than everything about that person.
    0:40:43 So maybe it will get better when we fill in the holes.
    0:40:47 I always think about what would be the interesting challenges to work on.
    0:40:48 That’s like one of the most interesting ones.
0:40:54 I think it is, because there you could improve the efficiency if you knew who the people
0:40:58 at the highest risk are, for whom you want to change the natural history, or what the algorithm
0:41:01 is predicting if it’s an adverse outcome.
    0:41:06 So eventually we’ll probably get there, but it isn’t nearly as refined as the other areas.
    0:41:11 But if you combine all these things together, this thing where you’re monitoring your body
    0:41:16 every five minutes and your diet and your exercise and your drugs and you have all this
    0:41:19 longitudinal data, that’s something that no one’s ever had before.
    0:41:27 Yeah, well, you’re bringing up a big hole in the story, which is multimodal data processing.
    0:41:29 We are not doing it yet.
0:41:35 Like a perfect example is in diabetes: people have a glucose sensor, and the only
0:41:40 algorithm they have tells them if the glucose is going up or down. That’s pretty dumb.
0:41:45 Why isn’t it factoring in everything they eat and drink and their sleep and activity
0:41:47 and the whole works?
    0:41:51 Some day we’ll have multimodal algorithms, but we’re not there yet.
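The gap being described, a trend-only forecast versus one that folds in meals and activity, can be shown with a toy comparison. Both models below are plain linear regressions on synthetic data; real CGM work would involve actual five-minute streams and engineered features from food and activity logs.

```python
# Hedged sketch: trend-only vs. multimodal glucose forecasting (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 3000
glucose_now  = rng.normal(100, 15, n)             # current CGM reading (mg/dL)
glucose_prev = glucose_now - rng.normal(0, 5, n)  # reading five minutes earlier
carbs_recent = rng.uniform(0, 80, n)              # carbs eaten in the last half hour
activity     = rng.uniform(0, 1, n)               # recent exercise intensity

# Synthetic "future" glucose: the trend matters, but so do meals and movement.
future = (glucose_now + (glucose_now - glucose_prev)
          + 0.4 * carbs_recent - 20 * activity
          + rng.normal(0, 5, n))

trend_X = np.column_stack([glucose_now, glucose_prev])
multi_X = np.column_stack([glucose_now, glucose_prev, carbs_recent, activity])

trend_model = LinearRegression().fit(trend_X, future)
multi_model = LinearRegression().fit(multi_X, future)
print("trend-only R^2: ", round(trend_model.score(trend_X, future), 3))
print("multimodal R^2:", round(multi_model.score(multi_X, future), 3))
```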
0:41:55 Well, so let’s go back to where we started, you know, a visit to the doctor in the future.
0:42:01 And like the good news is that the doctor doesn’t have to do any of the typing or recording,
0:42:06 AI is sort of figuring out the diagnosis, and the doctor has all the time now to actually
0:42:09 be empathetic and communicate, which is great.
    0:42:11 But is that now all that’s left?
0:42:15 No, no, not at all, because of the human touch.
    0:42:18 So when you go to see a doctor, you want to be touched.
    0:42:21 That’s the exam part of this.
    0:42:28 People, when they get examined for their heart and you don’t even take off their shirt, they
    0:42:30 know, they know there’s a shortcut going on.
    0:42:31 Yeah, that’s interesting.
0:42:41 They want a thorough exam because they know that that’s part of the real experience.
    0:42:44 And so what we’re talking about is the exam may change.
    0:42:46 Like, you know, for example, I don’t use a stethoscope.
    0:42:50 I use a smartphone ultrasound and do an echocardiogram.
    0:42:54 And I show it to the patient together as we’re doing it in real time, which the person would
    0:42:55 never see.
0:42:59 And by the way, they wouldn’t know what lub-dub looks like or sounds
0:43:02 like, but you sure can show them.
0:43:09 So the tools of the physical exam may change, but the actual hands-on aspects of it and
0:43:15 the interaction with the person, the patient, that’s the intimacy.
0:43:16 And we’ve lost that too.
0:43:23 You know, the physical exam has really become very much a detraction from what it used
0:43:24 to be.
    0:43:25 I mean, we need to get back to that.
    0:43:28 That’s what people want when they go see a doctor.
    0:43:33 And people have deprecated exams because essentially they said they weren’t of value, but it sounds
    0:43:36 like what was being done was not the part that needed to be done.
    0:43:41 Well, when you’re dealing with analog tools and, you know, they can be so superseded by
    0:43:42 the things we have today.
    0:43:46 And when you’re sharing them with the patient, so here’s what you have.
    0:43:51 And then you send them the video files or the metrics that they can look at, you know,
    0:43:54 when they get home and get more familiar with their body.
    0:44:00 It’s not only the physical exam that happens instantaneously in the encounter, but the
0:44:05 ability to have that archived data, so people learn more about themselves.
    0:44:08 That’s all part of that awareness that’s important.
    0:44:11 And you know, you talked about back to the future, there might be another sci-fi analogy.
    0:44:15 I think there’s some Star Trek episodes like this where actually the group that has the
    0:44:18 highest technology is the one where the technology is invisible.
    0:44:19 Yeah.
    0:44:23 And it sounds like that’s where the, all of this is going to be in the background.
    0:44:24 That’s right.
    0:44:28 You really are interacting with a person and this person now has just these powers that
    0:44:29 they couldn’t have before.
    0:44:30 Yeah.
    0:44:31 I’m with you all the way.
    0:44:32 Well, thank you so much.
    0:44:33 This has been fantastic.
    0:44:34 I really enjoyed it.

    with Eric Topol (@EricTopol) and Vijay Pande (@vijaypande)

    Artificial intelligence is coming to the doctor’s office. In this episode, Dr. Eric Topol, cardiologist and chair of innovative medicine at Scripps Research, and a16z’s general partner on the Bio Fund Vijay Pande, have a conversation around Topol’s new book, Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again. What is the impact AI will have on how your doctor engages with you? On the nature of the doctor’s visit as a whole? How will AI impact not just doctor-patient interactions, but diagnosis, prevention, prediction, medical education, and everything in between?

Topol and Pande discuss how AI’s capabilities for deep phenotyping will shift our thinking from population health to understanding the medical health essence of you, how the industry might respond and the challenges in integrating and introducing the technology into today’s system—and ultimately, what the doctor’s visit of the future might look like.

  • 339: Low Content Publishing: Can You Make Money Selling Blank Books on Amazon?

    This week I’m excited to introduce one of the hottest side hustles of the moment, and that is low-content self-publishing.

    What is low-content publishing?

    Let me explain — self-publishing has been around for a decade, and we’ve covered that quite a bit in both fiction and non-fiction — it’s one of my favorite and perhaps most passive side hustles. Write the book, hit publish, collect royalties for years.

    Cool, right?

    Well, it’s the whole write the book part where a lot of people get stuck.

    You might not know what to write about, and even if you do, it can be really time-consuming, and at the end of the day, it still might turn out to not be a huge seller on Amazon.

    What low-content publishing aims to accomplish is to accelerate your product creation by focusing on a very specific sub-set of books:

    • journals
    • diaries
    • planners
    • notebooks
    • sketchbooks
    • and more

    With these types of books, the value doesn’t come from your years of experience and a 35,000-word brain dump. Instead it comes from how you’ve structured the mostly-blank internal pages and prompts, and who you’re targeting as your customer.

    With Amazon’s print-on-demand KDP print service, you can upload these products as digital files, hold no physical inventory, and collect passive royalties whenever they sell. But there’s an art and a science to it, and that’s why I’ve assembled a panel of experts in today’s show.

    I’m joined by three experts in the low-content publishing space with more than 1,000 titles between them:

    Rob Cubbon from RobCubbon.com – Long-time listeners might remember Rob from episode 81, back in 2014. Rob’s been in the low-content game for the last year or so and in that time has published over 1,000 titles.

    Flav Maderios from SideBusinessLaunch.com – Flav was a guest on episode 300 of The Side Hustle Show last summer, where we were talking about his merch business. Since then he’s expanded to the self-publishing space with around 300 titles so far.

    Rachel Harrison-Sund from RachelHarrisonSund.com – Rachel built her low content business to 6-figures a year on a very part-time basis.

    But as you’ll hear in this episode, there’s some seasonality that comes into play, there are some competitive factors that come into play, and there’s a little bit of a gold rush feel to all of this.

    Tune in to hear how these low content publishers go about their product research (so they don’t waste their time), how they price and market their books to maximize sales and royalties, and how they manage such wide-ranging portfolios.

    Full Show Notes and PDF Highlight Reel: Low Content Publishing: Can You Make Money Selling Blank Books on Amazon?

  • 381. Long-Term Thinking in a Start-Up Town

    Recorded live in San Francisco. Guests include the keeper of a 10,000-year clock, the co-founder of Lyft, a pioneer in male birth control, a specialist in water security, and a psychology professor who is also a puppy. With co-host Angela Duckworth, fact-checker Mike Maughan, and the Freakonomics Radio Orchestra.

  • a16z Podcast: 10+1 Lessons from Serial Entrepreneur Justin Kan

    AI transcript
    0:00:03 The content here is for informational purposes only,
    0:00:05 should not be taken as legal business tax
    0:00:07 or investment advice or be used to evaluate
    0:00:10 any investment or security and is not directed
    0:00:14 at any investors or potential investors in any A16Z fund.
    0:00:18 For more details, please see a16z.com/disclosures.
0:00:21 – Hey, I’m Andrew Chen from Andreessen Horowitz
0:00:24 and today we have Justin Kan, who is one
    0:00:26 of our repeat entrepreneurs that we are very excited
    0:00:28 to be working with on Atrium.
    0:00:31 And so we’re gonna talk a bunch about
    0:00:33 what is it like to be a repeat entrepreneur?
    0:00:34 I think we were just going through the list.
    0:00:37 There’s like five different companies on there
    0:00:39 and you’ve learned a ton from every single one.
    0:00:42 And so we’re gonna do a series of sort of compare
    0:00:45 and contrast across quite a number of topics.
    0:00:47 But as a very first step,
    0:00:50 I think it’d be awesome to have Justin talk about
    0:00:53 some of the companies that you’ve been involved in.
    0:00:56 And I know the company you were running when we first met
    0:00:57 where you were running around with a camera on your head
    0:01:00 is not actually even your first one.
    0:01:01 There’s one before that called Kiko.
    0:01:03 So why don’t you talk about Kiko first?
    0:01:04 – Sure, yeah.
    0:01:06 So I’ve been an internet entrepreneur here
    0:01:09 for the last 14 years since 2005.
    0:01:11 Our very first company was called Kiko.
    0:01:13 It was kind of like Google Calendar,
    0:01:16 but it came out one month before Google Calendar came out
    0:01:20 and wasn’t really that good when I’m honest about it.
0:01:23 And so that company didn’t work out super well.
0:01:26 We ended up fire-selling it on eBay
0:01:28 after several failed acquisition attempts
0:01:30 with the Silicon Valley players.
    0:01:34 And then after we did that, we started another company.
    0:01:39 This one was even less well thought out than Kiko was actually,
    0:01:42 which was the idea was we would create our own
    0:01:45 live video reality TV show on the internet.
    0:01:48 I do think that tapped into several things
    0:01:49 that were in the zeitgeist
    0:01:50 that actually have become popular now.
    0:01:54 But unfortunately, we as talent in our own show
    0:01:56 were not very entertaining and not very popular.
    0:01:58 And so we launched this live streaming show.
    0:02:00 We called it Justin TV.
0:02:03 ‘Cause I was the only one of our four founders
    0:02:05 who was stupid enough to put the camera on his head
    0:02:09 and be like, I’ll make myself the subject of this show.
    0:02:12 And you literally wore, I remember you wore a backpack.
    0:02:15 – Yeah, so this was 2007.
    0:02:19 So it was pre-iPhone, pre-good cellular internet.
    0:02:21 So we had this computer in a backpack
    0:02:25 with like multiple cell phone modem connections.
    0:02:27 And we hooked it up to a camera.
    0:02:30 So there was a camera, this computer virtualized the video
    0:02:33 as a webcam, basically sent it over to the server.
    0:02:36 And then we had this very hacky way of streaming it out
    0:02:39 to the millions of people watching,
    0:02:40 actually not millions, hundreds,
    0:02:43 but the people who were watching at home,
    0:02:46 they were following along, it was actually pretty fun.
    0:02:48 They could text, we put our number up there,
    0:02:51 they would text me, usually fairly offensive things actually.
    0:02:55 But then eventually we launched this show,
    0:02:57 people started coming because they were like,
    0:02:59 what is this guy doing, this is crazy.
    0:03:02 They were like, you’re very boring, I hate your show,
    0:03:04 but I want to create my own live video stream.
    0:03:05 So how are you doing it?
    0:03:07 And then the light bulb kind of went off
    0:03:10 and we said, aha, let’s create a live video platform.
    0:03:12 And that became Justin TV, the platform.
    0:03:15 And then after that, we ran that for a couple of years,
    0:03:17 I’ve condensed it because it’s a very long story,
    0:03:19 but we raised a bunch of money,
    0:03:21 ran it for a couple of years,
    0:03:24 hit the nuclear winter of video startups
    0:03:26 where all the other video startups in Silicon Valley died.
    0:03:30 We were pretty scrappy and survived on like a ramen budget.
    0:03:33 Eventually decided we needed to pivot to some new ideas.
0:03:36 And from there we incubated a few ideas internally.
    0:03:38 One of them was a video app called social cam,
    0:03:40 which we spun off and eventually sold to Autodesk.
    0:03:45 And another one was a site that we,
    0:03:48 my co-founder really thought of,
    0:03:50 which was the idea was focus on,
    0:03:53 let’s focus on the video game related video on our site.
    0:03:57 And that became Twitch, which kind of grew and grew and grew.
    0:03:59 We eventually pivoted the entire company to Twitch.
    0:04:02 My co-founder Emmett was the CEO of the company
    0:04:06 and eventually sold it to Amazon for $970 million in 2014.
    0:04:08 About five years ago.
0:04:10 Along the way I started some other companies,
0:04:11 like this company called Exec,
    0:04:14 which is in the errand running/home cleaning space.
0:04:16 It’s kind of like a Handy or Homejoy.
    0:04:19 We actually ended up selling it to handy,
    0:04:22 which through an act of God sold to Angie’s List this year.
    0:04:26 And then more recently,
0:04:28 I’d been a partner at Y Combinator for a couple of years
    0:04:30 and then incubated a few companies.
    0:04:34 One of those was this video Q&A app called Whale.
    0:04:37 And then just fast forwarding all the way
    0:04:38 up till the present day,
0:04:42 decided I wanted to really put all my eggs back in one basket,
0:04:45 a precariously thrown-around basket.
    0:04:47 And so I decided I’d start a new company.
    0:04:48 That idea was Atrium,
    0:04:51 which is a technology enabled law firm for startups,
    0:04:53 really trying to solve all the problems that I had
    0:04:56 as an entrepreneur when dealing with legal.
    0:04:58 To make legal faster, more price predictable,
    0:05:00 more transparent for me, the business owner,
    0:05:02 you know, the business manager.
    0:05:05 And that’s what we set out to do about two years ago.
    0:05:06 It’s going pretty well.
    0:05:08 We’re serving a bunch of startups here in Silicon Valley
    0:05:09 with all of their needs.
    0:05:12 And it’s been great, but I’m sure we’ll get to that.
    0:05:13 – Awesome, yeah.
    0:05:18 Well, I think one place where I’m going to start on this
    0:05:23 is a lot of the advantages of being a repeat entrepreneur
    0:05:25 are that, you know, you can raise more money
    0:05:29 and there’s more, you know, maybe easier to recruit talent.
    0:05:30 And there’s, you know, there’s all these advantages
    0:05:32 that kind of, you know, come along with that.
    0:05:35 One of the disadvantages I find ends up being that,
    0:05:38 you know, there’s so many like distractions, right?
    0:05:39 Like you could, there’s a million different things
    0:05:40 you could do.
    0:05:42 There’s a lot of things pulling at your attention.
    0:05:45 I’m really fascinated by, you know, your movement
    0:05:48 from going from YC and an incubator
    0:05:51 where you can maybe kind of dip into a lot of little things
    0:05:53 versus kind of putting all your eggs in one basket
    0:05:54 and trying to like start a company.
0:05:56 Like, you know, talk to us about that,
    0:05:57 that kind of decision.
    0:05:59 – Yeah, that’s a great question.
    0:06:01 So once you become, you know, successful in some way
    0:06:03 in Silicon Valley, whether that’s, you know,
    0:06:04 you’ve been the executive at a company
    0:06:06 that’s, you know, rocket ship unicorn
    0:06:08 or you’ve started a company, you know,
    0:06:09 the world opens up, right?
    0:06:11 People want you to be a VC.
    0:06:15 They want you to, you know, work on projects with them.
    0:06:18 You can start any company that you want,
    0:06:19 which is great, right?
    0:06:20 But there’s just a paradox of choice
    0:06:22 and focus can be a huge problem.
    0:06:25 I know for a lot of friends of mine, it has been as well.
    0:06:27 As an investor, as a partner at YC,
    0:06:28 there were some parts I really liked.
    0:06:31 You know, I loved helping early stage founders out
    0:06:34 and like really working with them on these problems that,
    0:06:36 like I felt like for five to 10% of them,
    0:06:37 it was like life changing, right?
    0:06:39 Like me helping them out in a way,
    0:06:40 I came up with some great idea
    0:06:42 or helped them at a critical moment.
    0:06:43 That was life changing.
    0:06:44 But then there was a large majority of them
    0:06:46 that probably could have been listening
    0:06:48 to a YouTube video of me or this podcast
    0:06:50 and like gotten the same information, right?
    0:06:54 And so I didn’t feel like the feedback cycle
    0:06:56 was fast enough also as an investor.
    0:06:57 And I didn’t really feel ultimately,
    0:06:58 like after the first couple of years
    0:07:00 that I was continuing to learn and grow.
    0:07:02 And you know, I’m still in my 30s.
    0:07:03 I was like, I need to do something
    0:07:05 where I’m going to be forced to grow.
    0:07:09 And really the number one vehicle for personal growth
    0:07:11 that I’ve ever experienced in my life has been startups.
    0:07:13 You know, so I went back to what I knew,
    0:07:17 and I really decided that in order to grow as a founder,
    0:07:19 you know, we had a pretty big outcome with Twitch
    0:07:21 and I’d seen a lot of different things
    0:07:24 in order to really grow to the next level,
    0:07:26 I would have to do something that potentially
    0:07:27 could be even bigger.
    0:07:29 And so I really felt like it deserved,
    0:07:30 that deserved my full attention.
    0:07:33 And so that’s kind of how I decided.
    0:07:35 I don’t necessarily think it’s the right answer
    0:07:37 for everyone because what I didn’t mention
    0:07:39 was that I had selective memory at the time
    0:07:42 and I forgot just how painful starting a startup could be.
    0:07:45 And so for the first couple months of Atrium,
    0:07:46 I was like, oh man, this is a dream.
    0:07:47 I’ve like finally leveled up.
    0:07:48 I learned all these skills.
    0:07:51 I’ve like, I made it as a founder.
    0:07:53 And then, boom, reality set in.
    0:07:56 And of course, nothing ever goes according to plan.
    0:07:58 There were pains, there were struggles.
    0:08:01 And that was, you know, that’s part of the journey.
    0:08:03 But then the good part is, of course,
    0:08:06 when you experience pain, that is a catalyst for learning.
    0:08:08 And so I really got what I wanted in the end,
    0:08:10 which was being forced to learn.
    0:08:11 – Right, that’s great.
    0:08:13 And I know one of the big differences
    0:08:16 among some of the companies that you started
    0:08:17 in the past versus Atrium,
    0:08:20 and something that you’ve talked a lot about is,
    0:08:21 you had this sort of succession
    0:08:24 of very like consumer-oriented startups,
    0:08:26 sort of like watching other people play video games.
    0:08:28 Like that’s as consumers, basically.
    0:08:33 And so very interestingly, Atrium is a B2B thing.
    0:08:36 And why did you choose B2B?
    0:08:37 Was it just for novelty?
    0:08:39 Was it just to push yourself?
    0:08:43 Or do you think that there’s something different in mind
    0:08:47 that maybe takes advantage of some of your newfound skills?
    0:08:48 – Yeah, well, Twitch, I guess, is really the,
    0:08:50 it’s like the ultimate consumption thing,
    0:08:51 ’cause you’re consuming someone
    0:08:53 consuming video games.
    0:08:54 – Right, right.
    0:09:00 – For me, I felt like maybe this was an analysis
    0:09:01 I’ve done like retroactively,
    0:09:03 but I think I’ve had this discussion
    0:09:05 with a couple people, multiple-time founders.
    0:09:07 And when you’re an early stage founder,
    0:09:09 when we started Kiko and then Twitch,
    0:09:10 when we started Twitch, we were like,
    0:09:12 or Justin.tv, it was like we were 23 years old.
    0:09:14 And we had no skills.
    0:09:17 I never had a real full time job in my life.
    0:09:18 And even though we were programmers,
    0:09:20 we were like new college grad programmers.
    0:09:22 We were not good.
    0:09:24 We were horrible managers.
    0:09:26 We basically had nothing going for us.
    0:09:28 When you have nothing going for you,
    0:09:31 except that you’re, like, willing to put in,
    0:09:35 like, long hours and a lot of blood, sweat, and tears,
    0:09:38 then you should focus on things that are like,
    0:09:40 where there’s a lot of market risk,
    0:09:43 because ideas where there’s market risk,
    0:09:45 you can potentially win those, right?
    0:09:50 Now, as someone who has abilities and skills,
    0:09:53 where I’ve learned skills over the last 14 years,
    0:09:57 you wanna focus much more on like execution risk things.
    0:09:59 And so I felt like B2B startups
    0:10:01 are more about execution risk.
    0:10:03 I felt like this legal market,
    0:10:05 particularly was already a big market,
    0:10:06 and you just have to figure out
    0:10:08 how to do it 10 times better, right?
    0:10:09 And I felt like I had a roadmap
    0:10:11 for how to do that in my head.
    0:10:14 And so that’s why I felt that this was a better use of time.
    0:10:17 ‘Cause some of the consumer startups that I had incubated
    0:10:18 and played around with post Twitch,
    0:10:20 actually it was really hard to find product market fit, right?
    0:10:22 Like, I don’t have any advantage
    0:10:25 in finding product market fit with a consumer app
    0:10:27 more than the 22 year old Justin.
    0:10:28 – You might have a disadvantage.
    0:10:29 – Yeah, I probably have a disadvantage
    0:10:31 ’cause I’m like already more set in my ways.
    0:10:34 I’m like less in tune with the culture.
    0:10:36 I’m like, I don’t know what the kids are doing.
    0:10:40 So, would the Justin of today invent the Twitch of today?
    0:10:41 Like, I don’t think so, right?
    0:10:43 Like I’m an old guy now, man.
    0:10:44 (laughing)
    0:10:45 It’s game over for me.
    0:10:48 – I wanna unpack this market risk, execution risk.
    0:10:49 ‘Cause that’s obviously,
    0:10:51 it’s such an important distinction,
    0:10:55 but very colloquial, kind of in our understanding of it.
    0:10:57 What do you mean by market risk
    0:10:59 and sort of how maybe new entrepreneurs
    0:11:00 can have an advantage
    0:11:02 in tackling something with a new market?
    0:11:04 How do you know if an idea has a lot of market risk?
    0:11:05 – Sure, that’s great.
    0:11:08 So, Twitch and Atrium are the perfect examples, right, almost.
    0:11:10 So, Twitch, it’s like when we started,
    0:11:12 when we pivoted Justin.tv to Twitch,
    0:11:13 or even Justin.tv is a good example,
    0:11:15 but Twitch is the best one probably.
    0:11:17 When we pivoted Justin.tv to Twitch,
    0:11:20 nobody believed that this was a market, right?
    0:11:22 No one believed, no investors,
    0:11:25 very few of our even internal employees believed,
    0:11:27 and even the founders were skeptical.
    0:11:29 Emmett deserves the credit here, ’cause he had belief,
    0:11:31 but a lot of the other co-founders were skeptical.
    0:11:34 I’m like, does this exist as a business?
    0:11:37 And so, the good part is that the competition there
    0:11:39 was very low, right?
    0:11:41 There weren’t experienced entrepreneurs being like,
    0:11:43 this is gonna be a huge business, we should compete.
    0:11:47 So, really it was, the entire thing was market risk
    0:11:49 and figuring out, do we have product market fit,
    0:11:50 how do we build product market fit?
    0:11:51 – ‘Cause in that case,
    0:11:53 you’re trying to be the first in the category.
    0:11:54 There’s no substitutes really.
    0:11:57 You’re watching someone else play Street Fighter
    0:11:59 at the arcade, that’s a substitute.
    0:12:00 – Exactly, you have a lottery ticket, right?
    0:12:01 – You have a lottery, yeah.
    0:12:04 – It’s a lottery ticket, and if you pivot a bunch of times
    0:12:05 and listen to your customers,
    0:12:06 you might be buying more and more lottery tickets.
    0:12:08 But your lottery tickets are just as valuable
    0:12:11 as the experienced entrepreneur’s lottery tickets
    0:12:12 that he’s buying.
    0:12:14 So, he’s a fool to play that game,
    0:12:16 and I don’t think you see as many experienced entrepreneurs
    0:12:18 playing that same game.
    0:12:20 Whereas, with execution risk businesses,
    0:12:23 my lottery ticket now is way bigger than the guy
    0:12:26 who’s like the 22-year-old Justin, right?
    0:12:30 So, for a B2B startup, I know, oh, I can attract talent.
    0:12:32 I can hire a sales team.
    0:12:34 I can raise capital.
    0:12:36 And so, it’s a lot more, for something
    0:12:39 where it’s very established that that’s gonna be a business,
    0:12:41 or some business is gonna be in there,
    0:12:45 it’s like he’s stupid to play against me, you know?
    0:12:47 – Right, that makes sense.
    0:12:48 Well, you know, and I always find it funny
    0:12:51 that in the consumer startup world,
    0:12:54 that if you look at the last kind of decade of hits,
    0:12:56 if you were to tell people, oh, yeah,
    0:12:58 the biggest hits are gonna be this app
    0:13:00 that lets you get in strangers’ cars,
    0:13:02 this app that lets you stay at someone
    0:13:04 you don’t know’s like house,
    0:13:07 an app where you swipe left and right
    0:13:09 in order to meet people on the internet,
    0:13:12 and an app that lets you watch other people play video games.
    0:13:16 You would be like, that’s the craziest list of
    0:13:18 billion-dollar companies I’ve ever heard.
    0:13:21 And yet, that is actually how the consumer
    0:13:23 internet, you know,
    0:13:25 the ecosystem, actually unfolds, it’s insane.
    0:13:27 – It takes a lot of people who have nothing to lose
    0:13:30 to discover those ideas, right?
    0:13:33 – Right, yeah, and so one of the things,
    0:13:35 one of the clear advantages in all of this
    0:13:38 is that, you know, let’s talk about fundraising,
    0:13:42 and the decision on whether or not to raise
    0:13:44 a bunch of money out of the gate,
    0:13:48 versus doing the kind of, you know,
    0:13:51 ramen profitable, you know, cockroach thing, right?
    0:13:54 That’s sort of like, you know, one common contrast,
    0:13:55 and then obviously you had a very unique
    0:13:57 fundraising strategy as well.
    0:13:59 So maybe talk about that decision,
    0:14:01 and then kind of how you ended up pursuing it.
    0:14:03 – Look, I’m not convinced that raising a ton of money
    0:14:05 out of the gate is the right strategy.
    0:14:08 You know, the Silicon Valley is littered
    0:14:10 with dead bodies of these companies,
    0:14:13 you know, the Juiceros, and the Kools,
    0:14:15 and like all these companies that have raised
    0:14:17 a ton of money, and then like they,
    0:14:20 when you have a ton of money, you spend a ton of money, right?
    0:14:23 Now there’s other companies like the Jet.coms
    0:14:24 that, you know, they made it work.
    0:14:28 So, you know, I’m not convinced it’s the worst strategy ever,
    0:14:29 but I’m not convinced it’s the best.
    0:14:32 But for me personally, you know,
    0:14:34 where it’s an execution risk business,
    0:14:37 I’m too rich to, like, fuck around with the, like,
    0:14:39 you know, okay, I’m just gonna do a seed round,
    0:14:42 and just, like, take a long time, right?
    0:14:45 Like for me, speed to market and execution
    0:14:47 was really important, and I felt also like
    0:14:48 this market really supported and required it,
    0:14:51 because, you know, it is the legal space,
    0:14:54 there is, you know, a lot of value
    0:14:57 in making sure that the clients and the talent,
    0:15:00 the attorney talent, and other legal providers,
    0:15:03 like, think this company’s gonna be around, right?
    0:15:05 So that’s very important.
    0:15:06 And so, you know, this is a strategy,
    0:15:08 I don’t necessarily recommend to anyone
    0:15:11 who can raise a ton of money that this is the right strategy,
    0:15:13 it’s just the strategy that we picked.
    0:15:16 In terms of the tactics of like how we did our round,
    0:15:18 especially our Series A, you know, my idea was,
    0:15:22 because VCs are kind of naturally adjacent to legal,
    0:15:25 right, like, for example, when you fund a company,
    0:15:27 they need someone to help them on the legal side
    0:15:30 with all of the, you know, paperwork
    0:15:32 and the execution of that funding round.
    0:15:35 We felt like getting a lot of VCs on our side
    0:15:37 would be a good tactic, and so we ended up going out
    0:15:40 and raising money for our seed round from, you know,
    0:15:44 over 90 different investors from all over Silicon Valley
    0:15:46 because I felt like it would be really good
    0:15:48 to have those investors on our side
    0:15:50 and recommending Atrium as a channel partner,
    0:15:54 and so that was our tactic there.
    0:15:56 – Right, and I think it’s something where,
    0:16:00 you know, to your point, if something
    0:16:03 that you’re working on is primarily execution,
    0:16:05 then that means that, you know, you can,
    0:16:08 there are times and places where you can use money
    0:16:10 to solve it versus, it seems like, you know,
    0:16:12 part of the market risk thing is it just, you know,
    0:16:14 to your point, it sort of lets you buy more lottery tickets,
    0:16:17 but it may not accelerate the process
    0:16:19 of actually doing it, right, and so I think that,
    0:16:21 that sort of feels like one main difference,
    0:16:23 and then the other one is, you know,
    0:16:25 that just to build on what you’re saying is that
    0:16:28 if you are in an industry where trustworthiness
    0:16:31 is really important, then being well-capitalized is key,
    0:16:33 you know, the same way you wouldn’t, you know,
    0:16:36 if you’re gonna, you know, for example,
    0:16:39 you know, build a FinTech startup where, you know,
    0:16:41 you’re gonna ask consumers to trust their money
    0:16:44 with you, you know, like you wanna be legit,
    0:16:46 you wanna be well-capitalized, you wanna have, like,
    0:16:48 you know, super strong executives
    0:16:50 and board members and investors, and like,
    0:16:52 and that’s a strategy, kind of, that’s a little bit,
    0:16:54 kind of, like, self-perpetuating as well.
    0:16:55 – Yeah, that’s a great example.
    0:16:57 If you’re gonna build, like, a new bank, right,
    0:16:59 like an online, or like a mobile-first bank,
    0:17:01 which is, I think, some people are doing,
    0:17:05 would you wanna raise, you know,
    0:17:07 a $1 million seed round, or a $5 million seed round,
    0:17:11 or a $50 million Series A out of the gate?
    0:17:12 Obviously, if it’s available to you,
    0:17:15 you want more money, ’cause you know people need banks, right?
    0:17:17 It’s just a matter of can you do it better, right?
    0:17:19 That’s a perfect example, and there’s, you know,
    0:17:20 there’s many others here in Silicon Valley.
    0:17:22 I think we’ve actually shifted more
    0:17:25 as the cycle has changed over the last 10 years,
    0:17:26 the tech cycle.
    0:17:29 We’ve shifted more to these execution-risk startups,
    0:17:31 and so, you know, you’ve consequently seen,
    0:17:33 I mean, it’s a chicken and egg thing, really,
    0:17:35 what came first, but like, you’ve seen these more and more,
    0:17:37 like, bigger rounds, I’d say,
    0:17:39 that are supporting these companies.
    0:17:41 – Well, you know, one of my,
    0:17:44 one of the partners here, Chris Dixon,
    0:17:46 has talked about the idea that, you know,
    0:17:50 if you have a set of problems that has not been able
    0:17:52 to get solved and improved in, you know,
    0:17:55 the 20 years of the modern internet,
    0:17:59 then maybe all the techniques that we pioneered
    0:18:01 in the last, you know, decade plus,
    0:18:04 like being asset-light and just throwing software,
    0:18:06 and just shipping really quickly, and being really lean,
    0:18:09 like, maybe those techniques don’t work for a reason
    0:18:12 in healthcare and fintech and legal services
    0:18:14 in real estate, and like, you know,
    0:18:15 some set of those things, right?
    0:18:17 And so then, very quickly, you have to think,
    0:18:19 okay, well, you know, those techniques don’t work,
    0:18:20 otherwise, it would have,
    0:18:21 someone would have tried it already,
    0:18:23 it would have sort of, you know,
    0:18:25 a little bit like the efficient market hypothesis.
    0:18:26 – Yeah, that’s the thing.
    0:18:27 – It would have kind of like happened already,
    0:18:31 like, maybe you need a foundationally different approach.
    0:18:34 And so, I think that is actually one of the reasons why
    0:18:35 there’s more of these like, quote unquote,
    0:18:37 full-stack startups that are going after
    0:18:39 these really, really difficult areas.
    0:18:40 ‘Cause like, you know, otherwise,
    0:18:41 you wouldn’t be able to do it.
    0:18:44 – Yeah, well, we’re running the experiment right now.
    0:18:46 – Yes, yeah, right, no, I think that’s right,
    0:18:48 I think that’s right.
    0:18:50 So, you know, one of the fun topics,
    0:18:52 one thing that I have a tremendous amount of respect
    0:18:55 for you on is you’re very, you know,
    0:18:58 you’re always on the leading edge,
    0:19:00 thinking about, you know, self-improvement,
    0:19:03 how to sort of, you know, your own, you know,
    0:19:07 personal performance at work, you know, at home, et cetera.
    0:19:10 And obviously, one of the big things about, you know,
    0:19:12 running a company is that it is enormously stressful.
    0:19:13 Right? – Yeah.
    0:19:16 – And so, talk to us about like, you know,
    0:19:18 when you were a first-time entrepreneur,
    0:19:21 kind of Kiko, Justin.tv, you know,
    0:19:22 how did you think about, you know,
    0:19:26 managing the stress of, you know, running a company?
    0:19:27 And what was your approach there?
    0:19:29 And then let’s talk about kind of like,
    0:19:30 the new and improved Justin now, you know,
    0:19:32 kind of 10 years later.
    0:19:33 – Yeah, so in the early days, you know,
    0:19:37 10 years ago, I was not doing anything
    0:19:39 in terms of like improving myself.
    0:19:44 In fact, I think I used to think about people’s attributes,
    0:19:47 maybe not skills so much, in terms of like, you know,
    0:19:48 your skill at programming or stuff like that,
    0:19:50 but more of like your attributes.
    0:19:52 Like, I don’t know if you ever played Dungeons and Dragons,
    0:19:55 but when you create a character in Dungeons and Dragons,
    0:19:57 you roll this like 20-sided die and, you know,
    0:20:00 your strength, it’s like 14, your intelligence,
    0:20:02 you roll it and it’s six or whatever,
    0:20:04 and that’s what you have, you can’t change it.
    0:20:06 And so I felt like people’s attributes
    0:20:07 were kind of like that.
    0:20:09 And so I never worked on that very much,
    0:20:11 self-improvement stuff outside of, you know,
    0:20:12 like yeah, I became a better programmer
    0:20:14 ’cause we were programming, you know,
    0:20:17 but I didn’t work on things to like make myself,
    0:20:20 I don’t know, smarter, right, or harder working,
    0:20:22 or like awake more hours of the day, right,
    0:20:25 like alert more hours a day or anything like that.
    0:20:26 So it was very haphazard, you know,
    0:20:28 everything was kind of accidental.
    0:20:30 Like we would, you know, I was not dealing with stress
    0:20:31 well at all.
    0:20:35 If there was a problem in the company,
    0:20:37 I would be very emotionally avoided to it,
    0:20:41 or I would like drown my sorrows in like alcohol, right,
    0:20:43 which is not a very good coping mechanism at all.
    0:20:46 And so, and then like, you know,
    0:20:47 in terms of even just down to like
    0:20:49 what we were like eating at lunch,
    0:20:51 I was, you know, we were talking about this earlier,
    0:20:53 but I would, we would have like pizza every day
    0:20:56 at lunch at Justin.tv, and then I’d like fall asleep.
    0:20:57 Like I’d go into like a carb coma,
    0:21:00 like a carb coma in the afternoon, like every day.
    0:21:04 And so now more recently, like with Atrium,
    0:21:06 it’s been a tremendous vehicle
    0:21:08 for my own personal discovery,
    0:21:10 because I experienced the stress again.
    0:21:12 You know, I was like, oh my God, it’s stressful.
    0:21:16 Again, this is crazy, why is it so stressful?
    0:21:18 And so I started looking for ways to deal with that,
    0:21:21 and I really, I mean,
    0:21:24 I had a number of things that started working for me
    0:21:26 after I started really exploring it,
    0:21:31 starting last year, you know, everything from,
    0:21:33 a friend of mine recommended a daily gratitude journal.
    0:21:35 So I’ve just been doing that every day,
    0:21:37 writing this gratitude journal, the Five Minute Journal.
    0:21:39 You write down the three things every morning
    0:21:40 that you’re grateful for.
    0:21:42 And that seems like a very simple thing
    0:21:44 and kind of hokey, actually, when I first heard about it.
    0:21:48 But what I realized was it helps recontextualize
    0:21:49 all the ups and downs that you experience
    0:21:51 as an entrepreneur, especially, I mean,
    0:21:53 the downs really, like throughout the day,
    0:21:54 they’re not as big of a deal,
    0:21:55 because in the morning, you’re writing down,
    0:21:58 like, wow, I have this opportunity.
    0:22:00 You know, I remember the day we came in and pitched
    0:22:02 you guys here, I wrote in my gratitude journal,
    0:22:05 I get to pitch, you know, Andreessen Horowitz.
    0:22:06 That’s amazing, right?
    0:22:09 Even whatever happens, that’s an incredible opportunity.
    0:22:13 It puts me in the top 0.01% of people in the world
    0:22:15 in terms of opportunity, you know?
    0:22:18 So that was like pretty amazing.
    0:22:20 And then, you know, every day there’s something,
    0:22:21 like if you really think about it,
    0:22:24 there’s so many amazing things that happen to you
    0:22:27 as a human being here in Silicon Valley.
    0:22:29 Even just like actually one thing I think about in the morning,
    0:22:31 oftentimes it’s like the supply chain,
    0:22:33 the global supply chain to deliver coffee
    0:22:35 from like around the world,
    0:22:38 so that I can grind up fresh coffee
    0:22:41 and do like a pour-over in my Chemex in the morning.
    0:22:41 – Right.
    0:22:42 – That’s amazing.
    0:22:43 – It is amazing. – Like if you think about it.
    0:22:44 – It’s totally amazing. – Yeah.
    0:22:45 – Right.
    0:22:47 – So the gratitude journal really working for me.
    0:22:47 And then another thing–
    0:22:49 – You’re still eating pizza for lunch every day?
    0:22:52 – Yeah, so diet, another thing, I stopped eating pizza.
    0:22:54 – So, entirely?
    0:22:56 – No, well I started eating a ketogenic diet,
    0:22:58 which is a high-fat diet,
    0:23:01 but really the reason for me is like,
    0:23:05 I just don’t get as tired in the day anymore.
    0:23:06 And so that’s been really helpful.
    0:23:07 And last year I was experimenting,
    0:23:11 I did like some one-meal-a-day days,
    0:23:13 you know, like four days a week.
    0:23:15 Some of these weird like Jack Dorsey diets,
    0:23:17 kind of Jack Dorsey light or whatever,
    0:23:19 but it’s been good for me.
    0:23:21 You know, you have to do what your body feels like,
    0:23:21 what feels good.
    0:23:24 So, you know, that’s some diet and exercise,
    0:23:25 pretty religious about exercising,
    0:23:28 you know, try to do something every day during the work days.
    0:23:29 – Yeah.
    0:23:31 – And then the last thing that was really big
    0:23:33 is meditation for me.
    0:23:35 Started off just with Headspace, you know,
    0:23:38 I’m not the type of person that people, you know,
    0:23:40 assume would be a heavy meditator
    0:23:42 or very introspective or anything like that.
    0:23:46 But for me, just like starting off with Headspace last year,
    0:23:48 and then now I’ve been doing transcendental meditation,
    0:23:51 which is kind of what Ray Dalio talks about
    0:23:53 in Principles, that’s worked really well for me
    0:23:58 to just be more present during the day in my life.
    0:24:02 It’s pretty amazing, amazingly profound effect.
    0:24:03 – I feel like your Twitter stream
    0:24:05 is part of your gratitude journal.
    0:24:06 – Yeah.
    0:24:07 – ‘Cause I read your Twitter stream and I’m like,
    0:24:09 oh, this is like very philosophical.
    0:24:11 You know, it’s not like just pulling out like,
    0:24:15 oh, here’s a blurb from the latest S1 or something.
    0:24:17 Like you’re like, you know, you’re like sharing ways
    0:24:20 that the startup community can, you know,
    0:24:21 think about themselves.
    0:24:23 – Yeah, the way I think about it is like,
    0:24:25 if you want something to be part of your identity,
    0:24:26 talk about it.
    0:24:28 If you want to learn something, teach it.
    0:24:29 You know, and I really believe that.
    0:24:31 So for me putting out, you know,
    0:24:33 what I’ve been doing on the mindfulness side
    0:24:37 and for myself to be more emotionally kind of stable
    0:24:40 throughout the days and weeks and months,
    0:24:42 as a startup founder, that’s been really valuable to me.
    0:24:44 So if I can spread that to other people,
    0:24:46 it’s gonna help reinforce it as an identity for me,
    0:24:48 but it’s also gonna hopefully help those people as well.
    0:24:49 – Right.
    0:24:51 You know, one thing I’ve been working on a lot as well,
    0:24:54 the last thing I’ll say, is just working on realizing
    0:24:56 your attachments. Back in the very early days
    0:24:59 of Justin.tv and Twitch, Kiko and all these companies,
    0:25:03 I had a huge ego attachment to the outcome of the company.
    0:25:04 – Yeah.
    0:25:06 – Right, my identity and the companies
    0:25:07 were like very intertwined.
    0:25:10 And more recently, I realized that was the same
    0:25:11 at Atrium actually.
    0:25:13 Like again, I was like creating that same pattern,
    0:25:15 but it was like, it’s a very unhealthy pattern.
    0:25:18 So what I realized was I needed to start telling,
    0:25:21 reminding myself that no matter what happens
    0:25:24 with this company, I’m not gonna be any happier
    0:25:25 or any less happy in the long run.
    0:25:27 There might be a short burst of unhappiness
    0:25:30 if it fails or happiness if it succeeds amazingly,
    0:25:33 but you’re not gonna get any happier.
    0:25:35 Really, if you’re relying on outside things,
    0:25:38 external factors to drive your inner happiness,
    0:25:40 you will always be disappointed in the long run.
    0:25:43 And the funny thing is I’ve actually run the experiment,
    0:25:45 not an A/B test, but a linear experiment on that,
    0:25:48 because we’ve had more and more success over time.
    0:25:51 Like we started off really paying ourselves,
    0:25:54 I think with Kiko, $7,800 a month,
    0:25:57 but we only paid ourselves every other month.
    0:26:00 And so then we raised funding for Justin.tv
    0:26:02 and we were able to make a little salary,
    0:26:05 then we became profitable, we made a lot more,
    0:26:07 and then we sold one company, then we sold a lot of companies.
    0:26:09 And so we kind of ramped over time.
    0:26:14 And after the basics of Maslow’s hierarchy of needs
    0:26:17 were taken care of, I felt like I could go out to eat.
    0:26:18 I don’t know if that’s on one of those Maslow’s
    0:26:20 hierarchy of needs, but after that basic thing,
    0:26:22 I could afford to go out to eat.
    0:26:24 None of it ever mattered.
    0:26:26 It never made me sustainably more happy.
    0:26:28 And so just reminding yourself of that
    0:26:30 and trying to remove those attachments in your mind,
    0:26:31 it’s easier said than done,
    0:26:34 but having that as like an active practice
    0:26:36 has been really important to me.
    0:26:38 – Let’s talk about mentorship as part of that, right?
    0:26:41 So when you’re building Kiko, Justin.tv
    0:26:45 and you’re a first time entrepreneur,
    0:26:49 there’s a lot of experienced people
    0:26:52 who are kind of, like, ahead of you in the thing.
    0:26:55 And so that’s fantastic ’cause you can learn a ton.
    0:26:57 Would love to hear kind of who you thought of
    0:26:59 as your kind of lifelong mentors,
    0:27:01 people that have helped you for a long time.
    0:27:04 And then the other problem that’s super interesting is,
    0:27:06 then you start atrium and you’ve had
    0:27:08 some major successes behind you.
    0:27:11 And then in some ways, the number of people
    0:27:13 you can learn from is a much smaller pool
    0:27:16 and sort of like, how do you kind of curate your mentorship?
    0:27:17 Maybe how has it changed over time?
    0:27:20 That’s a broad topic, but would love to hear your opinion.
    0:27:21 – Sure, absolutely.
    0:27:23 So like the best part about Silicon Valley, in my opinion,
    0:27:26 is that there are people here who have done it before
    0:27:27 who are willing to help you.
    0:27:30 We would never have made it here even day one
    0:27:35 without people who helped us when it was like economically
    0:27:37 a bad waste of time.
    0:27:40 We weren’t like, we didn’t look like a hot prospect
    0:27:41 company or anything, right?
    0:27:44 So those people, well, Paul Graham, great example,
    0:27:45 founder of Y Combinator, Paul Buchheit,
    0:27:47 who is like a partner in Y Combinator,
    0:27:50 but he invented Gmail inside of Google.
    0:27:52 These are people who invested in us very early on,
    0:27:54 mentored us very early on and helped us out.
    0:27:57 And that was pretty amazing.
    0:28:01 And really that ethos perpetuates itself,
    0:28:03 because then as a partner in Y Combinator,
    0:28:05 even today as an entrepreneur,
    0:28:08 where there’s like lots of conflicting,
    0:28:09 competing interests for my time,
    0:28:12 I always make sure to spend a little bit of time
    0:28:14 mentoring other startups,
    0:28:16 because it’s kind of like a pay it forward thing.
    0:28:19 And you do the thing that people did for you
    0:28:21 when you were younger.
    0:28:25 – One of the really obvious sources of mentorship
    0:28:26 is actually peer mentorship, right?
    0:28:28 And some of the folks that kind of came up with you
    0:28:32 at the same time and ended up running
    0:28:34 really interesting companies on their own.
    0:28:37 And I know they all live in Duboce with you
    0:28:39 in San Francisco.
    0:28:40 Talk to us about some of your friends
    0:28:42 that you consider your mentors.
    0:28:44 – Yeah, so it’s great to have friends
    0:28:46 who are kind of on the same path in a lot of ways,
    0:28:48 but are one or two steps ahead of you.
    0:28:49 So that’s still the case.
    0:28:52 Obviously Emmett, my co-founder who’s still running Twitch,
    0:28:53 it’s over a thousand people.
    0:28:56 I think it’s like 1500 people or something like that.
    0:28:59 My brother, who’s COO of Cruise,
    0:29:01 co-founder of Cruise, the self-driving car company.
    0:29:02 And they’re like over a thousand people.
    0:29:05 And Steve, the founder of Reddit,
    0:29:08 they’re like 400 something people or whatever.
    0:29:11 So, you know, seeing what their problems are,
    0:29:13 you know, obviously the problems are always the same,
    0:29:14 actually the problems are always like,
    0:29:16 I don’t have the right alignment among my team
    0:29:19 and I don’t have the right executive team.
    0:29:21 It’s usually some variation of those things.
    0:29:24 But, you know, hearing it from the horse’s mouth
    0:29:28 is super helpful in terms of making decisions for me.
    0:29:31 And then, you know, even outside of Duboce,
    0:29:32 sometimes I venture into Soma
    0:29:33 and having some of those, you know,
    0:29:35 kind of early YC founders who have really made it,
    0:29:38 you know, like Drew from Dropbox, for example,
    0:29:41 and just knowing like how do they think about
    0:29:44 everything, all those things, executive hiring, et cetera,
    0:29:45 it’s been really helpful to me.
    0:29:48 So, luckily, here in Silicon Valley,
    0:29:49 I have a lot of great resources.
    0:29:51 And the last thing I’ll shout out is,
    0:29:55 I have this new executive coach, Matt Mochary,
    0:29:57 who’s like amazing.
    0:29:58 This guy is the guru.
    0:30:01 He’s, you know, mentors a lot of different fast-growing
    0:30:02 startups around here.
    0:30:04 I just talked to him for like a 360 thing.
    0:30:05 – Yes. – Yeah, yes.
    0:30:07 – This guy. – Yeah, that was great.
    0:30:09 – Incredible, highly recommend, can’t speak highly enough,
    0:30:11 he’s like changing my life.
    0:30:13 So, I feel like he’s an angel sent down from Heaven
    0:30:16 to teach me, finally, after 14 years,
    0:30:19 how to like manage a system.
    0:30:21 And I’ve learned a lot from him, so.
    0:30:21 – That’s great.
    0:30:23 – Those are kind of three sets, you know,
    0:30:26 executive coach, the peer mentors,
    0:30:28 and then kind of those early stage mentors
    0:30:29 that I had back in the day.
    0:30:30 – That’s great.
    0:30:33 And I know one of the topics that you must end up
    0:30:36 talking about often is that when you’re building something
    0:30:39 that has a little bit more just execution risk,
    0:30:42 you know, you’ve raised some real money
    0:30:44 to kind of get started.
    0:30:46 A lot of it ends up being sort of like organizational
    0:30:47 complexity, company culture.
    0:30:51 I know this is a big, big area of focus for you.
    0:30:53 And that’s something that is very different
    0:30:54 when you’re trying to build something for the long run
    0:30:56 versus when you’re kind of just trying to find
    0:30:59 product market fit, and it’s kind of like 10 people,
    0:31:00 and you’re just like, is this even going to make it?
    0:31:02 Like, let’s not even work on this.
    0:31:03 – Yeah.
    0:31:05 – So talk to us about kind of how your approach has changed
    0:31:07 on building the company and your leadership style.
    0:31:09 – That is a great question because it’s something
    0:31:10 I think a lot about.
    0:31:15 So I had never thought before a couple of months ago,
    0:31:18 and this may sound stupid in a way,
    0:31:20 but I’d never thought, what is the kind of company
    0:31:22 that I want to show up to work at?
    0:31:25 So 14 years later, finally thinking about it.
    0:31:28 But the real answer is like, when you’re a 22 year old,
    0:31:30 just starting your company, or you’re in Silicon Valley
    0:31:33 and you’re thinking funding rounds and exits,
    0:31:35 you’re always thinking, what’s the next milestone?
    0:31:37 Like, how do I just claw my way desperately?
    0:31:39 However, whatever it takes, how do I get to that
    0:31:40 next milestone?
    0:31:41 It’s do or die.
    0:31:43 And for some, you know, oftentimes it is do or die.
    0:31:46 You don’t have the luxury, oftentimes, of thinking
    0:31:48 about what kind of, or you feel like you don’t have
    0:31:50 the luxury of thinking about what kind of company
    0:31:51 you want to build culturally.
    0:31:54 And so, I started at Atrium actually very much
    0:31:56 in the same way, but like, what are the metrics milestones
    0:31:57 we want to hit?
    0:31:58 What’s the next metrics milestone?
    0:32:00 What do we need to get to for a Series B
    0:32:02 or a next round of funding?
    0:32:05 And so, the problem with that was that a year in,
    0:32:07 I realized, oh, shoot, I need to like,
    0:32:10 there are like a lot of things that I’ve neglected
    0:32:13 that are actually affecting our ability to execute.
    0:32:15 And the number one thing there was,
    0:32:18 what’s the culture going to be?
    0:32:21 People didn’t know, like, what’s the alignment aspect
    0:32:22 of like, what are we building?
    0:32:24 What kind of company are we?
    0:32:25 Who are we?
    0:32:26 What are we building?
    0:32:28 And then, what’s the culture?
    0:32:29 How are we being intentional about it?
    0:32:32 So, we did a lot to work on that,
    0:32:35 ran through a collaborative values process over a year ago
    0:32:36 where we brought the whole company together,
    0:32:38 to figure out what we care about.
    0:32:40 And then more recently, I’ve been thinking about,
    0:32:43 after a lot of this self-work in terms of making myself
    0:32:46 feel kind of consistently good every day
    0:32:49 and remove my attachments to the outcome,
    0:32:53 I’ve realized there’s a set of principles
    0:32:55 that I want to implement at the company.
    0:32:57 And that I think that execution will flow
    0:32:59 from those things, right?
    0:33:01 If we build a company that has a high empathy
    0:33:04 for each other, where we have care for each other,
    0:33:06 where people are very collaborative,
    0:33:10 where people feel like the locus of control
    0:33:12 for what’s happening is inside of them,
    0:33:13 instead of outside of them.
    0:33:16 Things are happening through them, not to them.
    0:33:19 I think that all the execution will actually flow from that.
    0:33:21 One of the things I never understood before,
    0:33:23 which I feel like I really understand now,
    0:33:26 is that saying that culture eats strategy, right?
    0:33:28 I felt like I had very good strategy with Atrium,
    0:33:31 but I forgot about culture in that first part of the company,
    0:33:34 and now I realized how important it is.
    0:33:37 So one of the things I’ll say that we’re doing
    0:33:39 is recently I read this book called
    0:33:41 “The 15 Commitments of Conscious Leadership,”
    0:33:42 which is an amazing book,
    0:33:46 but it’s really about building a certain type of company,
    0:33:49 what the authors call a conscious company.
    0:33:54 But it really centered around that locus of control question.
    0:33:56 Do you have radical responsibility
    0:33:59 for what’s going on in your life, in your company,
    0:34:00 regardless of who you are?
    0:34:03 And I read that book, and I realized this is the type
    0:34:05 of company that I want to work at.
    0:34:08 It’s a company populated by team members
    0:34:11 who really believe in these principles.
    0:34:13 And so we’re kind of going through a process
    0:34:15 of trying to implement that at our company.
    0:34:18 And really, culture is one of the highest priorities.
    0:34:20 It went from something that I didn’t prioritize
    0:34:22 to my top priority.
    0:34:24 – Yeah, well, and I think it’s really interesting
    0:34:29 because you’d started Kiko out of school, right?
    0:34:33 And so unlike some folks who maybe,
    0:34:35 they go and they work at Google or Facebook or something,
    0:34:36 and they maybe have a template
    0:34:39 for the company culture they want to create,
    0:34:41 this is something that you kind of had to learn
    0:34:44 and adjust over many, many kind of company iterations
    0:34:46 of various companies that you built.
    0:34:48 – That’s right, we had never worked at a place
    0:34:51 with good culture or a culture, right?
    0:34:53 Because we had always worked at our own company,
    0:34:55 so we were just making it up as we went along.
    0:34:57 When you’re not intentional about your culture
    0:34:59 or type of company you want to be,
    0:35:02 then the culture ends up being the accidental collection
    0:35:05 of good and bad choices and personality quirks
    0:35:08 and good and bad behaviors that your founding
    0:35:12 and executive teams propagate, right?
    0:35:13 And it’s accidental, right?
    0:35:17 And oftentimes there are things that are not good behaviors,
    0:35:19 and they get propagated culturally.
    0:35:22 And often times people justify it because they’re like,
    0:35:26 well, they conflate the correlation with causation, right?
    0:35:29 They’re like, because we, you know,
    0:35:34 have behaviors where maybe we’re like a low empathy
    0:35:35 company, let’s say, but they don’t call it that.
    0:35:36 They’re just like, we make decisions
    0:35:38 based on like meritocracy, right?
    0:35:40 And they’re like, the best idea is gonna win,
    0:35:43 but then maybe that’s just because that’s a behavior
    0:35:44 that they’ve propagated,
    0:35:46 and that might not be the real reason
    0:35:49 why they’ve actually been winning, right?
    0:35:51 I think a lot of companies in Silicon Valley
    0:35:55 kind of succeed despite their management actually,
    0:35:56 not because of it.
    0:35:59 And what I mean by that is like the idea was so good
    0:36:02 that a bunch of 25-year-olds could run the company, right?
    0:36:04 That core product market was so good,
    0:36:05 it was just a rocket ship
    0:36:07 and then people were just trying to hang on.
    0:36:09 Now, eventually I think they do figure it out,
    0:36:10 but often times in those early days,
    0:36:12 and I think it’s actually quite, you know,
    0:36:14 not intentional and often times not that good.
    0:36:15 – Yeah.
    0:36:18 You know, in our conversation today,
    0:36:20 we’ve talked about all the things that you’ve changed.
    0:36:21 – Yeah. – Right?
    0:36:24 You’ve changed, you know, from consumer to B to B,
    0:36:27 you’ve changed, you know, how fast you fundraise,
    0:36:30 you know, there’s been a lot of different changes.
    0:36:33 You know, is there anything that you feel like you,
    0:36:35 like there’s a core that you’re like,
    0:36:37 okay, there’s this thread that I’m trying to do the same
    0:36:39 between all the companies,
    0:36:41 or is it just really like iterating very quickly
    0:36:44 and, you know, you’re doing a lot that’s different?
    0:36:46 – Well, I think that core ethos of, yeah,
    0:36:50 iterating quickly, you know, that’s like a YC ethos,
    0:36:52 that’s something that we really did
    0:36:54 carry on from the very beginning.
    0:36:59 So speed was, you know, something that’s pretty important.
    0:37:04 I think that really being helpful in the community,
    0:37:06 not just your startup, I mean, including your startup,
    0:37:08 but also the community of startups,
    0:37:09 that’s something that we learn,
    0:37:11 the behavior we learn from, you know, the early days of YC,
    0:37:14 and even like our community of friends who were founders,
    0:37:17 who all became successful, you know, helped each other out,
    0:37:19 and then now today, you know, it’s an ethos
    0:37:21 that we take to Atrium, to really build a company
    0:37:24 that’s, you know, kind of for startups, you know,
    0:37:26 by startups, that helps out these, you know,
    0:37:28 fast-growing startups.
    0:37:30 So that ethos is probably something
    0:37:33 that’s pretty similar, you know, kind of similar
    0:37:35 to like what you guys have at Andreessen, right?
    0:37:38 Which is like, if we can, you know, be the most helpful
    0:37:40 in terms of providing these networks of services,
    0:37:43 that is something that is going to, you know,
    0:37:45 kind of pay dividends for us as a brand.
    0:37:47 You know, that’s something I believed personally,
    0:37:50 and then also at Atrium throughout my entire career.
    0:37:51 – Right, right.
    0:37:54 Yeah, I mean, as you know,
    0:37:56 one of the things that’s great about the Bay Area
    0:37:59 is that it ends up being this very long-running,
    0:38:02 relationship-driven place where, you know,
    0:38:05 you meet people like, I mean, we met like 10 plus years ago,
    0:38:07 right, and that’s the kind of interesting thing
    0:38:11 where there’s many, many cases where you can work together.
    0:38:14 And so, you know, focusing on value creation,
    0:38:16 as opposed to like, how am I going to try to position myself
    0:38:18 to like capture the most value?
    0:38:21 Like, I think that, you know, like that certainly runs,
    0:38:22 like I think it’s one of the very special things
    0:38:24 about the Bay Area. – Absolutely.
    0:38:26 If I think about where we’re at, you know,
    0:38:29 where all the people who I, you know, saw in the early days,
    0:38:32 like 13 years ago, all these different founders,
    0:38:33 you know, these two-person startups
    0:38:35 that weren’t even anything, and where they’re at now,
    0:38:37 their company might not have succeeded,
    0:38:40 but they have like created some value here in Silicon Valley
    0:38:42 by being in that ecosystem, being helpful,
    0:38:44 and then, you know, maybe becoming an executive
    0:38:45 at someone else’s company,
    0:38:47 or becoming an investor, early-stage investor
    0:38:48 at another company that really worked.
    0:38:51 And so, there really is like a feeling
    0:38:55 that you kind of get out what you put into this community,
    0:38:57 and that’s one of the things I really love about it.
    0:39:02 – How do you think about the idea that, you know,
    0:39:05 when you’re first getting started,
    0:39:07 you’re a first-time entrepreneur,
    0:39:09 there’s kind of like low expectations, right?
    0:39:11 ‘Cause you’re like, maybe this’ll work,
    0:39:11 maybe this won’t work,
    0:39:15 people’s expectations of you are like kind of low too,
    0:39:16 ’cause they’re kind of like, I don’t know,
    0:39:18 who knows, you know, Justin’s off in San Francisco
    0:39:20 doing this thing, he’s running around with a camera
    0:39:24 on his head, and it’s just kind of a fun thing.
    0:39:27 And then, now, several companies later,
    0:39:30 because, you know, also you’ve raised money,
    0:39:32 and because you’ve done a lot, et cetera,
    0:39:34 you know, the expectations must be higher.
    0:39:36 Like, how do you think about, you know,
    0:39:38 how do you think about those expectations
    0:39:39 managing your own, you know,
    0:39:41 your own expectations around that?
    0:39:46 – Well, I always think that every entrepreneur’s expectations
    0:39:49 for themselves are exceedingly high, right?
    0:39:52 If you’re the type of person who was a PM or engineer
    0:39:53 at, you know, one of these fan companies,
    0:39:55 and then you’re like, I’m gonna start a startup
    0:39:57 ’cause I see other people doing it,
    0:39:59 then you don’t think, you don’t go into it thinking,
    0:40:01 well, I’m just gonna create like a whatever,
    0:40:04 something that’s like a nice small business, right?
    0:40:06 You go in being like, I’m gonna raise series A
    0:40:08 from Andreessen, and we’re gonna be, you know,
    0:40:10 this product that just takes off,
    0:40:12 the next Snapchat or whatever.
    0:40:16 And so I think that, like, it’s always a battle
    0:40:20 against your own, like, the kind of devil
    0:40:22 on your shoulders telling you, you’re not good enough,
    0:40:24 you know, you’re not doing well enough,
    0:40:25 you could be doing better.
    0:40:27 And I think the way that you win that battle
    0:40:31 is by really internalizing and realizing
    0:40:36 that whatever happens, you’re gonna be fine.
    0:40:38 And you’re probably gonna be the same,
    0:40:40 you’re not gonna be happier or less happy, actually.
    0:40:43 And now most people do not actually successfully
    0:40:46 internalize that well enough, in my opinion.
    0:40:48 But it is true, I firmly believe it’s true,
    0:40:51 and it’s something that the sooner you start practicing
    0:40:54 that in your head, really, you know,
    0:40:58 feeling that and experiencing that in yourself internally,
    0:41:00 then the happier you’ll be.
    0:41:02 Now, that said, a lot of, you know,
    0:41:04 people I come into contact with, like, friends even,
    0:41:06 or people who work for me are like,
    0:41:11 well, my drive, like, that need to win at all costs
    0:41:12 is my edge.
    0:41:14 But I really don’t believe, I mean,
    0:41:15 maybe that’s true for other people,
    0:41:17 but for me, I never found that to be the case.
    0:41:21 It’s like, no, it was just like the kind of unhappiness
    0:41:23 that was created around it that would make it
    0:41:25 actually less sustainable for me to continue on,
    0:41:27 because I was always, you know, you can’t,
    0:41:28 human beings don’t wanna live in a high stress,
    0:41:30 high anxiety state for too long.
    0:41:31 That’s how you can burn out, right?
    0:41:35 So startups are not, I mean, contrary to popular belief,
    0:41:37 it’s not a sprint, it’s a marathon, right?
    0:41:39 There’s these overnight successes,
    0:41:42 Twitch came out of nowhere, eight years from
    0:41:44 incorporating that company to selling it,
    0:41:45 almost exactly eight years, right?
    0:41:47 So that is a long time.
    0:41:49 And in order to last a long time,
    0:41:52 you need to figure out a way that you are okay
    0:41:54 with what’s going on, and what’s going on
    0:41:57 is always gonna involve bad things.
    0:41:59 Like things, you know, there’s gonna be good
    0:42:00 and there’s gonna be bad.
    0:42:01 So if you don’t figure out a way
    0:42:04 that you psychologically are okay with that,
    0:42:06 you’re gonna give up, and if you give up,
    0:42:09 you’re never gonna see it to the ultimate potential
    0:42:12 that whatever your startup is can be.
    0:42:12 – Great.
    0:42:18 I’m gonna throw in kind of two last questions in here.
    0:42:23 One is, you know, what are you reading these days?
    0:42:25 Do you have any sort of recommendations, podcasts
    0:42:27 that you like, kind of media consumption
    0:42:30 as kind of a way to learn yourself?
    0:42:31 – Yeah.
    0:42:33 – And anything sort of especially super impactful
    0:42:37 over the last couple years that you wanna reference?
    0:42:38 – Yeah, that’s great.
    0:42:40 So reading a lot, actually,
    0:42:43 ’cause I deleted all the entertainment apps off my phone,
    0:42:45 including the browser, and I’d locked it
    0:42:47 so that I don’t can install new apps
    0:42:48 ’cause I was a total phone addict.
    0:42:49 That includes Twitch.
    0:42:50 – Wait, how do you lock?
    0:42:51 How do you lock your–
    0:42:51 – So you can lock installing new apps,
    0:42:52 you can delete the app store
    0:42:54 and put a passcode lock on it.
    0:42:55 – Oh.
    0:42:56 – And I gave my wife the passcode,
    0:42:58 or she put in a passcode, I don’t know what it is.
    0:42:59 – Right.
    0:43:00 – So I can’t–
    0:43:01 – This is like a parental lock.
    0:43:02 – It’s a parental lock, exactly.
    0:43:04 So I don’t have control over my own phone anymore.
    0:43:07 But the consequence of that is I read a lot more books,
    0:43:09 which is good.
    0:43:12 A couple of ones that have been particularly impactful to me.
    0:43:15 Number one is this book called The Untethered Soul.
    0:43:17 So this book changed my life.
    0:43:20 It’s really about the idea that you are not
    0:43:22 the thing that you think you are.
    0:43:25 Most people go through life thinking they are the experiences
    0:43:27 that they’ve experienced, the thoughts that they have,
    0:43:29 or the emotions that they have.
    0:43:33 But really, you’re just the observer of these things
    0:43:34 that are happening.
    0:43:36 And by creating, it’s almost like you’re watching a movie,
    0:43:39 right, like a movie that has all the five senses,
    0:43:40 plus emotions, plus thoughts.
    0:43:45 So like a seven-dimensional movie of the Justin life, right?
    0:43:49 And I think that was a very important message to me
    0:43:52 to realize, like an internalize that like actually
    0:43:55 these attachment, these things that I think will happen
    0:43:57 that will like drive happiness, like experiences or events
    0:43:59 or whatever will never actually drive
    0:44:01 true internal happiness.
    0:44:04 So amazing book, The Untethered Soul, highly recommend it.
    0:44:07 Another book that I think made me a much better leader,
    0:44:10 was this book I read called Leadership and Self-Deception.
    0:44:12 Amazing book.
    0:44:14 The premise of the book is really that there’s two ways
    0:44:17 to treat other people, inside the box and outside the box,
    0:44:21 but really I mean like without empathy or with empathy,
    0:44:25 like treating them like an object or a person, right?
    0:44:28 And the idea, the fundamental idea behind the book
    0:44:31 is if you treat people like an object,
    0:44:33 number one, they don’t like it, right?
    0:44:36 But the second thing is, if you treat people like objects
    0:44:38 who are just there to fulfill something for you, right?
    0:44:40 Like at work, it would be like to, you know,
    0:44:43 hit some number or metric or whatever, do some job.
    0:44:46 The problem is that you will lie to yourself about
    0:44:50 when there’s negative situations about what your role is,
    0:44:51 you’ll self deceive, you’ll say,
    0:44:53 oh, this person is 100% at fault
    0:44:55 and I’m 0% at fault.
    0:44:58 And I found myself actually doing that a lot of times.
    0:45:02 I realized, you know, I felt like I,
    0:45:07 you know, had a high degree of empathy for other people,
    0:45:09 but I realized that was for people who
    0:45:10 I felt were performing really well,
    0:45:13 where my observation was like high performance.
    0:45:16 But where my observation,
    0:45:19 where my feeling was that there wasn’t high performance,
    0:45:23 I felt like I would slip into this like low empathy mode.
    0:45:25 And the problem is that when you’re in that mode,
    0:45:27 you don’t admit, what are the things that I have done?
    0:45:29 What, you know, Justin, what are the things I have done
    0:45:30 to contribute to that situation?
    0:45:34 So examples could be, I put someone in the wrong job,
    0:45:37 didn’t give them clear enough criteria for success or failure,
    0:45:39 didn’t support them with the right resources, right?
    0:45:42 There’s many reasons why I could have contributed
    0:45:44 to some situation failing.
    0:45:47 And I found that like I would lie to myself in those situations.
    0:45:51 So, you know, that was a really important book for me.
    0:45:53 And those are probably two ones I really recommend.
    0:45:55 – That’s great.
    0:45:57 Are you doing a lot of podcasts right now?
    0:45:59 – Listening to them? – Yeah.
    0:46:01 – Been listening to– – That kind of entertainment.
    0:46:03 – Yeah, I listened to some podcasts.
    0:46:06 I’ve been listening to some of the Joe Rogan experience.
    0:46:09 – Nice. – And that’s probably the,
    0:46:10 this is probably it.
    0:46:12 I like, I just listened to one with Alex Honnold,
    0:46:15 where he’s talking about climbing, you know,
    0:46:16 so it’s pretty interesting.
    0:46:20 – I’m gonna ask you the time travel question.
    0:46:24 So if you, today, an enlightened, repeat entrepreneur,
    0:46:27 could go back to yourself when, you know,
    0:46:30 you're 22, 23, just getting started doing this thing.
    0:46:34 You know, what advice would you give yourself?
    0:46:37 – When I’m just getting started, you know,
    0:46:40 probably join Facebook.
    0:46:45 No, but the, maybe the real answer,
    0:46:47 I’ve actually, you know, I don’t really regret
    0:46:49 any of the, like, economic choices or anything.
    0:46:52 I think I’ve had this tremendous opportunity
    0:46:55 to, like, build and discover these new things
    0:46:59 and build companies and I wouldn’t trade it for anything.
    0:47:01 I think the thing I could have done better
    0:47:03 or, like, learned, you know, back then is,
    0:47:07 you know, self-improvement is a thing.
    0:47:08 You should probably, like, work on that.
    0:47:09 Maybe that’s number one.
    0:47:10 The second thing would be–
    0:47:11 – Stop eating pizza at work.
    0:47:12 – Yeah, stop eating pizza at work.
    0:47:14 Number three would be,
    0:47:18 you should, you know, things take time.
    0:47:20 Like, don’t be in such a, you know,
    0:47:22 it’s not, like, a one-year, one-and-done,
    0:47:24 like, no, you’re a billionaire, you know?
    0:47:26 Like, if you look at the, let’s say,
    0:47:29 any sort of, like, the Amazon share price
    0:47:30 or, like, market cap over time, right?
    0:47:32 It looks, like, even through, like,
    0:47:36 the last couple years looks like an exponential curve, right?
    0:47:39 And so, you know, if Bezos had been, like, at,
    0:47:41 you know, like, year 15,
    0:47:44 which is a long time to be running a company,
    0:47:46 like, oh, man, I made it, I’m done,
    0:47:47 like, I’m gonna retire, like,
    0:47:50 well, you know, the company would look a lot different,
    0:47:52 right, so, you know, things take time
    0:47:54 and I have to constantly remind myself that.
    0:47:57 I think humans, you know, here in Silicon Valley,
    0:47:59 especially, but then human beings in general,
    0:48:01 are wired to, like, always want the new,
    0:48:02 you know, look for the new thing.
    0:48:03 What’s new?
    0:48:05 What are you, like, what’s your new thing?
    0:48:07 Something that everybody’s asking about in Silicon Valley?
    0:48:10 They’re always, you know, that’s a question here.
    0:48:12 But the best entrepreneurs here,
    0:48:13 the ones who have created lasting companies
    0:48:16 and lasting value, they stick with their thing
    0:48:19 for decades, you know?
    0:48:21 And that’s what impresses me most now
    0:48:25 and I wish that I had kind of realized that, you know,
    0:48:25 before.
    0:48:28 – Awesome.
    0:48:29 – Better late than never.
    0:48:30 (laughing)
    0:48:31 – Awesome.
    0:48:33 Justin, thank you for coming by.
    0:48:33 – Yeah.
    0:48:35 – Thanks for having me. – Really good discussion.

    Want actionable advice from a founder who has built multiple tech companies and has invested the time to be open, introspective, and transparent about lessons learned?

    In this episode (which originally aired as a YouTube video), a16z General Partner Andrew Chen (@andrewchen) talks with Justin Kan (@justinkan). Justin is a repeat entrepreneur who co-founded Kiko Software (a Web 2.0 calendar that pre-dated Google Calendar by 4 years); Justin.tv (a lifecasting platform); Twitch.tv (a live streaming platform for esports, music, and other creatives now part of Amazon); Socialcam; and now Atrium, a software-powered law firm for startups.

    Justin reflects on his journey and shares 10 + 1 lessons he’s learned:

    • The paradox of choice: choosing a focus
    • Tradeoffs between B2B versus B2C companies
    • Market risk vs execution risk
    • Fundraising strategy: go big or stay lean?
    • Managing the stress of being a startup CEO (again!)
    • Seeking out mentors, coaches, and peers for help
    • Intentionally designing a culture to avoid the pitfalls of “culture eating strategy”
    • Things he’s still doing in his latest startup—and things he’s doing very differently
    • Managing higher expectations
    • What he’s reading and listening to
    • Bonus: advice he’d give his 20-year-old self

    The views expressed here are those of the individual AH Capital Management, L.L.C. (“a16z”) personnel quoted and are not the views of a16z or its affiliates. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by a16z. While taken from sources believed to be reliable, a16z has not independently verified such information and makes no representations about the enduring accuracy of the information or its appropriateness for a given situation.

    This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. You should consult your own advisers as to those matters. References to any securities or digital assets are for illustrative purposes only, and do not constitute an investment recommendation or offer to provide investment advisory services. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors, and may not under any circumstances be relied upon when making a decision to invest in any fund managed by a16z. (An offering to invest in an a16z fund will be made only by the private placement memorandum, subscription agreement, and other relevant documentation of any such fund and should be read in their entirety.) Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by a16z, and there can be no assurance that the investments will be profitable or that other investments made in the future will have similar characteristics or results. A list of investments made by funds managed by Andreessen Horowitz (excluding investments and certain publicly traded cryptocurrencies/ digital assets for which the issuer has not provided permission for a16z to disclose publicly) is available at https://a16z.com/investments/.

    Charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision. Past performance is not indicative of future results. The content speaks only as of the date indicated. Any projections, estimates, forecasts, targets, prospects, and/or opinions expressed in these materials are subject to change without notice and may differ or be contrary to opinions expressed by others. Please see https://a16z.com/disclosures for additional important information.

  • a16z Podcast: Deep Learning for the Life Sciences

    AI transcript
    0:00:02 Hi, and welcome to the A16Z podcast.
    0:00:03 I’m Hannah.
    0:00:05 Deep learning has come to the life sciences.
    0:00:09 Lately it seems every week a published study comes out with code on top.
    0:00:14 In this episode, A16Z general partner on the bio fund Vijay Pande and Bharath Ramsundar talk
    0:00:17 about how AI and ML are unlocking the field in a new way,
    0:00:22 in a conversation around their recently published book, Deep Learning for the Life Sciences,
    0:00:25 written along with co-authors Peter Eastman and Patrick Walters.
    0:00:30 The book aims to give developers and scientists a toolkit on how to use deep learning for
    0:00:35 genomics, chemistry, biophysics, microscopy, medical analysis, and other areas.
    0:00:36 So why now?
    0:00:40 What is it about ML’s development that is allowing it to finally make an impact in this
    0:00:41 field?
    0:00:43 And what is the practical toolkit?
    0:00:44 The right problems to attack?
    0:00:46 The right questions to ask?
    0:00:51 Above and beyond that, as this deep learning toolkit becomes more and more accessible, biology
    0:00:54 is becoming democratized through ML.
    0:00:57 So how is the hacker ethos coming to the world of biology?
    0:01:01 And what might open source biology truly look like?
    0:01:05 So Bart, we spent a lot of time thinking about deep learning and life sciences.
    0:01:10 It’s a great time, I think, for people to become practitioners in this space, especially
    0:01:15 for people maybe that’s never done machine learning before from the life sciences side,
    0:01:18 or maybe people from the machine learning side to get into life sciences.
    0:01:22 But maybe the place to kick it off is what’s special about now?
    0:01:23 Why should people be thinking about this?
    0:01:28 The challenge of programming biology has been that we don’t know biology, and we make up
    0:01:33 theoretical models, and the computers are wrong, and biologists and chemists understandably
    0:01:36 get grumpy and say, “Why are you wasting my time?”
    0:01:40 But with machine learning, the advantage is that we can actually learn from the raw data.
    0:01:43 And all of a sudden, we have this powerful new tool there.
    0:01:45 It can find things that we didn’t know before.
    0:01:50 And this is why it now is the time to get into it, really to enable that next wave of breakthroughs
    0:01:51 in the core science.
    0:01:57 The part that still blows me away is just how fast this field is moving, and it feels
    0:02:03 like it’s a combination of having the open source code on places like GitHub and Archive,
    0:02:07 and there’s a paper or a week that’s impactful when it used to be maybe a paper or a quarter
    0:02:09 or a paper a year.
    0:02:13 And the fact that code is coming with the paper, it’s just layering on top.
    0:02:17 That seems to me to be the critical thing that’s different now.
    0:02:21 I think when you can clone a repo off GitHub, you also don’t have new insights just because
    0:02:23 I’m using a new language.
    0:02:27 And now that thousands of people are getting into it, I think all of a sudden you’ll find
    0:02:32 lots of semi-self-taught biologists who are really starting to find new, interesting things.
    0:02:33 And that is why it’s exciting.
    0:02:38 It’s like the hacker ethos, but kind of coming into the bio world, which has typically been
    0:02:40 much more buttoned down now.
    0:02:44 I think anyone who can clone a repo can start really making a difference.
    0:02:47 I think that’s going to be where the real long-term impact arises from these types
    0:02:48 of efforts.
    0:02:53 You don’t need a journal subscription to get archive or to get the code, which is actually
    0:02:54 that alone is kind of amazing.
    0:02:59 It wasn’t that long ago where a lot of academics offer was sold, and it was maybe sold for
    0:03:01 $500, which is very material.
    0:03:02 That’s one piece.
    0:03:08 You connect that to the concept of now AI or ML can unlock things in biology.
    0:03:12 Then biology is becoming democratized as kind of your point.
    0:03:17 And so let’s talk about that because we’re still learning biology collectively.
    0:03:20 What is it about deep learning in biology now?
    0:03:22 Because biology’s old, machine learning is old.
    0:03:23 What’s new now?
    0:03:26 Deep learning has this question all over the place.
    0:03:27 Why does it work now?
    0:03:30 The first neural nets kind of popped out in the 1950s.
    0:03:32 And I think it’s really a combination of things.
    0:03:38 I think that part of it is the hardware, really, the hardware, the software, the growth of kind
    0:03:42 of rapid linear algebra stacks that have made it accessible.
    0:03:47 I think also an underappreciated part of it is the growth of the cloud and the internet
    0:03:48 really.
    0:03:51 Neural nets are about as janky now as they used to be in the '80s.
    0:03:55 The difference is that I can now pull up a blog post where someone says, “Oh, these things
    0:03:56 are janky.
    0:03:57 Here’s the 17 things I did.
    0:03:59 I can copy, paste that into my code.”
    0:04:01 And all of a sudden, I’m a neural net expert.
    0:04:02 It’s all quite that easy.
    0:04:07 It turns it into a tradecraft, almost, that you can learn by just working through it.
    0:04:09 That's why the deep learning toolkit, again, has been accessible.
    0:04:14 Then you get to biology, and the question is why biology, why now?
    0:04:17 And I think you’re actually the question’s a little deeper.
    0:04:21 I think that it’s really about, I think, representation learning.
    0:04:27 So we have now reached this point where I think we can learn representations of molecules
    0:04:28 that are useful.
    0:04:33 This has been something that in the science of chemistry, we’ve been doing a long time.
    0:04:38 There’s been all sorts of hand-encoded representations of parts of molecular behavior that we think
    0:04:39 are important.
    0:04:44 But I think now using the new technology from image processing, from word processing, we
    0:04:47 can begin to learn molecular representations.
    0:04:50 To be fair, I actually don’t think we’ve really broken through there.
    0:04:55 If you look at what’s happening in images or text, there are five years ahead of us.
    0:05:00 Well, let me break in here because just for the listeners to give a sense for why representation
    0:05:05 is important, and one of my pet examples is that if I gave anybody, say, two five-digit
    0:05:07 numbers to add, it’d be trivial.
    0:05:12 If I gave you those same five-digit numbers in Roman numerals and you wanted to add them,
    0:05:14 the representation there would make this insane.
    0:05:15 And what would you do?
    0:05:21 Well, you would convert into appropriate representation where the operations are trivial or obvious.
    0:05:26 And then the operation is done, and maybe it re-encodes, auto-encodes back to the other
    0:05:27 representation.
    0:05:28 So this is the problem.
    0:05:32 It’s like when you have a picture, representations are obvious because it’s pixels, and computers
    0:05:34 love pixels.
    0:05:39 And maybe even for DNA, DNA is like a one-dimensional image, and so you have bases that are kind
    0:05:40 of like pixels.
    0:05:44 We used to joke early days that we would just take a photograph with a small molecule and
    0:05:46 then use all the other stuff, but that’s kind of insane too.
    0:05:51 And so with the right representation, things become transparent and obvious; with the wrong
    0:05:53 representation, things become hard.
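
    A minimal sketch of the representation point above, in Python (the Roman-numeral parser is a standard textbook one, not anything from the book): the same two numbers are trivial to add as integers and painful as numerals, so you first convert into the representation where the operation is easy.

    ```python
    # Map each Roman symbol to its value.
    VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

    def roman_to_int(s: str) -> int:
        """Convert a Roman numeral to an integer, handling subtractive pairs like IV."""
        total = 0
        for ch, nxt in zip(s, s[1:] + " "):
            v = VALUES[ch]
            # A smaller value written before a larger one is subtracted (IV = 4).
            total += -v if nxt in VALUES and VALUES[nxt] > v else v
        return total

    # Convert into the representation where addition is trivial, then operate.
    print(roman_to_int("MMXIX") + roman_to_int("XLII"))  # 2019 + 42 = 2061
    ```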
    0:05:54 This is really at the heart of machine learning.
    0:05:59 It’s that there’s something about the world that I want to compute on, but computers only
    0:06:06 accept very limited forms of input, zeros and ones, text strings, like simple structures.
    0:06:11 Whereas if you take a molecule, a molecule is like a frighteningly complex entity.
    0:06:16 So one thing that we often don’t realize is that until 100 years ago, we barely had any
    0:06:17 idea what a molecule was.
    0:06:23 It’s this alarmingly strange concept that although we see little diagrams in 10th grade
    0:06:26 chemistry or whatever, that isn’t what a molecule is.
    0:06:31 It’s a much weirder, weirder quantum object, dynamic, kind of shifting, flowing.
    0:06:33 We barely understand it even now.
    0:06:37 So then you just really start asking the question of what is water, for example?
    0:06:40 Is it the three characters, H2O?
    0:06:43 Is it two hydrogens and oxygen?
    0:06:45 Is it some quantum construct?
    0:06:47 Is it this dynamic vibrating thing?
    0:06:49 Is it this bulk mass?
    0:06:52 There’s so many layers to kind of the science of it.
    0:06:55 So what you really want to do is you’ve got to pick one, and this is where it gets really
    0:06:56 hard, right?
    0:07:01 Like, if I’m thirsty, what I care about in water is a glass of water.
    0:07:06 If I’m trying to answer deep questions about the structure of Neptune, I might want a slightly
    0:07:08 different representation of water.
    0:07:14 The power of the new deep learning techniques is we don’t necessarily have to pick a representation.
    0:07:17 We don’t have to say water is X or water is Y.
    0:07:22 Instead, you say, let’s do some math, and let’s take that math and let the machine really
    0:07:27 learn the form of water that it needs to answer the question at hand.
    0:07:32 So one form of mathematical construct is thinking of a molecule as a graph.
    0:07:37 And if you do this, you can begin to do these graph-deep learning algorithms that can really
    0:07:41 extract meaningful structure from the molecule itself.
    0:07:46 We’ve learned, finally, that here’s a general enough mathematical form we can use to extract
    0:07:52 meaningful insights about molecules or these critical biological chemical entities that
    0:07:56 we can then use to answer real questions in the real world.
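
    To make "a molecule as a graph" concrete, here is a hedged sketch using RDKit (a common open-source cheminformatics library; the speakers don't name it at this point, though the book's ecosystem builds on it): atoms become nodes, bonds become edges, and a graph-convolution model operates on exactly this structure.

    ```python
    from rdkit import Chem

    mol = Chem.MolFromSmiles("CCO")  # ethanol, written as a SMILES string

    # Nodes: one per heavy atom, labeled by element.
    nodes = [atom.GetSymbol() for atom in mol.GetAtoms()]

    # Edges: one per bond, as (begin_atom_index, end_atom_index) pairs.
    edges = [(b.GetBeginAtomIdx(), b.GetEndAtomIdx()) for b in mol.GetBonds()]

    print(nodes)  # ['C', 'C', 'O']
    print(edges)  # [(0, 1), (1, 2)]
    ```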
    0:08:00 What I think is interesting here in particular is that so much has been developed on images,
    0:08:03 and there’s a lot of biology that’s images, and so we could just spend the whole time
    0:08:08 talking about images, and it could be microscopy or radiology and tons of good stuff there.
    0:08:12 But there’s a lot of biology that’s more than images, and molecules is a good example.
    0:08:16 For a long time, it seemed like deep learning was being so successful in images that that’s
    0:08:17 all it really did.
    0:08:23 And if you could take your square peg and put in whatever holes you got, it would work.
    0:08:26 What you’re talking about for graphs is kind of an interesting evolution of this, because
    0:08:30 a graph and an image are different types of representations.
    0:08:35 But at a technical level, convolutional networks for images or graph convolutions for graphs
    0:08:39 are kind of a sort of borrowing a concept at a higher level.
    0:08:44 The biology version of machine learning is starting to sort of grow up and starting to
    0:08:49 not just be a direct copy of what was done with images and in other areas, but now starting
    0:08:50 to be its own thing.
    0:08:55 A five-year-old can really point out the critical points in an image, but you almost
    0:08:58 need a PhD to understand the critical points of a protein.
    0:09:04 So you have this like dual kind of weights, a burden of understanding, so it’s taken
    0:09:09 a while for the biological machine learning approach to really mature because we’ve had
    0:09:13 to spend so much time even figuring out the basics.
    0:09:18 But now we’re finally at this point where it feels like we are diverging a little bit
    0:09:23 from the core trunk of what people have done for images or text.
    0:09:26 In another five years, I’m going to be blown away by what this thing does.
    0:09:28 It’s going to understand more deeply.
    0:09:35 So we kind of have this sort of connection between democratization of ML, ML into biology,
    0:09:38 democratization into biology, but I don’t think we’re there yet.
    0:09:42 I think for ML, I think there really is a sense of democratization.
    0:09:49 You could code on your phone and do some useful things or certainly on a laptop, a cheap laptop.
    0:09:51 But for biology, what is missing?
    0:09:53 One is data, and there’s a fair bit of data.
    0:09:58 In the book, we talk about the PDB, we talk about other data sets, and there are publicly
    0:10:02 available data sets, but somehow that doesn’t get you into the big leagues.
    0:10:06 So like if in this vision of democratizing biology, what’s left to be done?
    0:10:12 In some ways, the democratization of ML is a teensy bit of an illusion even.
    0:10:17 It’s because that the core constructs were mathematically invented, that there is this
    0:10:24 convolutional neural net or its cousins, the LSTM or the other forms of core mathematical
    0:10:28 breakthroughs that have been designed, that you can take these building blocks and just
    0:10:30 apply them straight out.
    0:10:35 In biology, as you pointed out earlier, I think we don’t have those core building blocks
    0:10:36 just yet.
    0:10:41 We don’t know what the LEGO pieces are that would enable a newcomer to really start to
    0:10:44 do breakthrough work.
    0:10:45 We’re closer than we were.
    0:10:49 I think we’ve had the beginnings of a toolbox, but we’re not there yet.
    0:10:53 Let’s think about what happened on the ML side as inspiration for the Bio side.
    0:10:54 How much is it driven through academia?
    0:10:56 How much driven through companies?
    0:10:59 Because what I’m getting at is that there’s a lot of IO in academia.
    0:11:02 I don’t know if we’re seeing that being made open sourced in companies.
    0:11:07 We’re getting to this really weird set of influences where in order for companies to
    0:11:09 gain influence, they need to open source.
    0:11:14 This is why 10 years ago, I can’t imagine that Google would have open sourced TensorFlow.
    0:11:20 It would have been core proprietary technology, but now they know that if they don’t do that,
    0:11:24 developers will shift to some other platform by some other company.
    0:11:25 Exactly.
    0:11:30 It’s weird that the competitive market forces are driving democratization.
    0:11:35 Most of PyTorch basically is Facebook-based and TensorFlow is from Google.
    0:11:37 Let’s say Google kept TensorFlow proprietary.
    0:11:39 What would be so bad for them if they did that?
    0:11:41 What if everybody outside used PyTorch?
    0:11:45 I think there’s a really neat analogy to the financial sector.
    0:11:50 A lot of financial banks have masses of functional programs that they keep under the hood, under
    0:11:51 the covers.
    0:11:55 If you look at Jane Street, or I believe Standard Chartered, or a few of these other
    0:12:00 big institutions, lots and lots of functional code hiding behind those walls.
    0:12:04 But that really hasn’t really infiltrated further out.
    0:12:09 This actually, I think, in the long run weakens them because it’s harder to train, it’s harder
    0:12:12 to find new talent, it’s more specialized.
    0:12:17 A lot of the code base at Google is proprietary, like the original MapReduce was never put
    0:12:18 out there.
    0:12:22 This I think has actually caused them a little bit of a problem in that new developers coming
    0:12:27 in have to spend months and months and months getting up to speed with the Google stack,
    0:12:32 whereas if you look at TensorFlow, it doesn’t take any time at all, someone could walk in
    0:12:34 and basically be able to write TensorFlow.
    0:12:36 They’ve been using it for months to years.
    0:12:37 Exactly.
    0:12:42 And I think at the scale that Big Tech is at, this is just like, it’s a powerful market
    0:12:43 advantage.
    0:12:45 They’re almost outsourcing their education process.
    0:12:48 And I guess if they don’t put it out, someone else will, and then they’ll learn on their
    0:12:49 platform.
    0:12:52 Yes, but then maybe what is the missing part in biology?
    0:12:57 We’ve got pharma, a huge force there, but they have very specific goals.
    0:13:02 A lot of agricultural companies, but it’s much more disparate.
    0:13:08 Yeah, it’s dramatically hard to actually take an existing organization and turn it into
    0:13:10 an AI machine learning organization.
    0:13:17 So one thing I’ve honestly been surprised by is that when I’ve seen companies or organizations
    0:13:22 I know try to incorporate AI into their drug discovery process, it ends up taking them
    0:13:27 years and years and years, because they’re fighting all these upstream battles, weeks
    0:13:32 to get their old computing system to upgrade to the right version of their numerical library
    0:13:35 so they could even install TensorFlow.
    0:13:41 And then they had all these things about who can actually say, upgrade the core software,
    0:13:44 who is it this department?
    0:13:47 How much do they need to talk to the biologists, to the chemists?
    0:13:52 And the fact is that pharma and existing big codes are not built this way.
    0:13:57 That’s not their core expertise, whereas if you look at Facebook or Google, they’ve been
    0:14:02 machine learning for almost two decades now, from the first AdWords model.
    0:14:07 So in some sense, they had to change very little about their culture, like, yeah, there’s
    0:14:11 a slight difference instead of this function, use that function, but whatever.
    0:14:15 But the core culture was there, and I think the culture, the people, changing that is
    0:14:20 going to be dramatically hard, which is why I think it will really take, I think, ten
    0:14:24 years and a generation of students who have been trained in the new way to come in and
    0:14:25 shift it.
    0:14:26 Yeah.
    0:14:27 Well, Google was a startup too, right?
    0:14:31 I think, you know, the thesis was that, and is that, that startups will be able to build
    0:14:32 a new culture.
    0:14:37 And I think the key thing that I think we’re seeing sort of boots on the ground is that
    0:14:41 culture has to be not, here’s your data scientists are machine learning people in one room and
    0:14:45 you’re biologists in another room, that they’d have to be the same.
    0:14:50 What’s intriguing to me is just the size of the bio market.
    0:14:55 Biology is healthcare, it’s agriculture, it’s food, it could be the future of manufacturing.
    0:14:59 There’s so many different places that biology plays a role to date and will play a role,
    0:15:02 but it just means that I think, I think to the point we’re talking about these companies
    0:15:06 just are being built right now.
    0:15:12 There’s I think this whole host of challenges here because biology is hard and building kind
    0:15:17 of like that selective understanding of like, you know, of the 10 best practices that existed.
    0:15:19 Five are actually still best practices.
    0:15:23 The other five we need to toss out a window and stick in a deep learning model.
    0:15:27 That kind of very painstaking process of experimentation and understanding.
    0:15:31 That I think is like where the really hard innovation is happening.
    0:15:32 And that’s going to take time.
    0:15:36 You’re never going to be able to replace like a world-class biologist with any machine learning
    0:15:37 program.
    0:15:43 A world-class biologist is typically fricking brilliant and they often bring a set of understanding
    0:15:46 that no programmer or no computer scientist can.
    0:15:51 Now, the flip side holds true and I think that merger, as you said, that's where, like,
    0:15:53 the magic and dynamism is.
    0:15:57 One really interesting factoid I heard from an entrepreneur in the space is that, you
    0:16:03 know, the best biologists that they could hire had a market rate that was lower than
    0:16:10 an introductory, intermediate, you know, front-end dev and, you know, of course, front-end is
    0:16:11 very hard engineering.
    0:16:15 I don’t want to put that down, but there’s so many fewer of these biologists, so there’s
    0:16:21 almost this market imbalance of how is it possible that, you know, you can take really
    0:16:27 a world-class biologist of whom there’s maybe a couple of hundred in the world and not have
    0:16:29 them be valued properly by the market.
    0:16:32 So do you even out those pay scales in one company?
    0:16:37 Do you like have two awkward pay ladders that coexist and create tension in your company?
    0:16:41 These are the types of like really hard operational questions that almost have nothing to do with
    0:16:43 the science, but at the heart of it they do.
    0:16:45 Maybe it’s interesting to talk about like how we can help people get there.
    0:16:49 Yeah, so what’s like the training they should be doing, maybe we could even go like super
    0:16:50 nuts and bolts.
    0:16:52 So I got my laptop, what do I do?
    0:16:58 So I mean, like, I guess there’s a couple key packages we install, like TensorFlow, maybe
    0:17:00 DeepChem, something like that.
    0:17:04 Python is often already installed, let’s say on a Mac, is that it?
    0:17:06 And then we start going through papers and books and code.
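
    For the nuts-and-bolts setup being described, a plausible minimal version (package names here are the usual public ones; DeepChem is the open-source library that accompanies the book's examples) looks something like this:

    ```python
    # First, from a shell:  pip install tensorflow deepchem
    import tensorflow as tf
    import deepchem as dc

    print(tf.__version__)  # confirm the stack imports at all

    # Smoke test: Delaney is a small public solubility dataset bundled with DeepChem.
    tasks, datasets, transformers = dc.molnet.load_delaney()
    train, valid, test = datasets
    print(train.X.shape)  # featurized molecules, ready for a model
    ```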
    0:17:11 I think the first place really is to, you need to form an understanding of like what
    0:17:14 are the problems even that you can think about.
    0:17:19 I think if you’re not trained as a biologist, and even if you are, you might not see that
    0:17:25 intersection of these are the problems where biological machine learning can or cannot work.
    0:17:29 And that I think is really what the book tries to teach you, as in like, what’s the frame
    0:17:30 of thinking?
    0:17:34 What’s the lens at which you look at this world and say that, oh, that is data coming
    0:17:36 out of a microscope.
    0:17:40 I should spend 30 minutes, spin up a convnet, and iterate on that.
    0:17:46 This is a really gnarly thing about how I prepare my like, you know, C. elegans samples.
    0:17:49 I don’t think the deep learning is going to help me here.
    0:17:52 And I think it’s that blend of knowledge that the book tries to give you.
    0:17:53 It’s like a guidebook.
    0:17:57 When you see a new problem, you ask, is this a machine learning problem?
    0:17:59 If so, let me use these muscles.
    0:18:03 If it’s not a machine learning problem, well, I know that I need to talk to someone who
    0:18:05 does know these things.
    0:18:06 And that’s what we try to give.
    0:18:08 Andrew Ng has a great rule of thumb.
    0:18:12 If, you know, a human can do it in a second, deep learning can probably figure it out.
    0:18:19 So start with something like say microscopy, you have an image coming in and an expert
    0:18:22 can probably eyeball and say, interesting, not interesting.
    0:18:24 So there’s this binary choice.
    0:18:30 And there’s some arcane black box that was trained within the expert’s head and experience.
    0:18:34 That’s actually the sort of thing machine learning is like made to solve.
    0:18:38 So really ask yourself, like, when you see something like that, is there some type of
    0:18:44 perceptual input coming in, image, sound, text, and increasingly molecules, a weird
    0:18:49 new form of perception, almost magnetic or quantum, but you have perceptual input coming
    0:18:50 in.
    0:18:56 And is there a simple right, wrong, left, right, intensity type answer that you want
    0:18:57 from it?
    0:19:00 If you do, that’s really a machine learning problem at its heart.
    0:19:01 Well, so that’s one type of machine learning.
    0:19:06 And I think the benefit there of that, what human can do in a second, deep learning can
    0:19:12 do, especially since, in principle, on the cloud, you could spin up 100,000, 10,000 servers.
    0:19:15 Suddenly you’ve got 10,000 people working to solve the problem.
    0:19:17 And then they go back to something else.
    0:19:19 That’s just something you can’t do with people.
    0:19:24 Or you’ve got 10,000 people working 24/7, as necessary, can’t do that with people.
    0:19:28 But there’s another type of machine learning, which is to do things people can’t.
    0:19:32 Or maybe more specifically, do things individual people can’t, but maybe crowds could.
    0:19:37 So like we see this in radiology, right, where the machine learning can have accuracies
    0:19:41 greater than any individual, akin to what, let’s say, the consensus would be, which would
    0:19:43 be the gold standard.
    0:19:47 That’s maybe the real exciting part, sort of the so-called superhuman intelligence.
    0:19:49 Where are the boundaries of possibilities there?
    0:19:54 One of the biggest problems really with deep learning is that you have some like strange
    0:19:56 and crazy prediction.
    0:20:02 Now I think that there’s a fallacy that people fall into of trusting the machine too easily.
    0:20:07 Because 90% of the time that’s going to be garbage.
    0:20:12 And I think that really kind of the challenge of picking out these bits of superhuman insight
    0:20:16 is to know how to shave off the trash predictions.
    0:20:17 Yeah.
    0:20:19 Is 90% an exaggeration or is it really 90%?
    0:20:23 I like nice round numbers, so that might have just been something I picked out.
    0:20:26 But there’s like this great example, I think, in medicine.
    0:20:32 So there’s scans coming in and the deep learning algorithm is doing like amazing at predicting
    0:20:33 it.
    0:20:38 And then like they dug into it and it turned out that the scans came from three centers.
    0:20:43 One of them had like some type of center label that was like the trauma center or something.
    0:20:44 There’s the other non-trauma center.
    0:20:49 The deep learning algorithm, like a kindergartner told to do this, learned to identify the trauma
    0:20:52 flag and flag those and upweight those.
    0:20:57 So if you did this like naive statistics of blending them all out, you’d look amazing.
    0:20:58 But really it’s looking for a sticker.
    0:20:59 Yeah.
    0:21:00 I mean, there’s tons of examples like that.
    0:21:04 One with the pathologist with the ruler in there and it’s becoming a ruler detector and
    0:21:05 so on.
    0:21:09 Like, you know, this AUC, like a sense of accuracy, of close to 1.0.
    0:21:14 We all got to be very suspicious of that because just running a second experiment wouldn’t
    0:21:16 predict the first experiment with that type of accuracy.
    0:21:18 Anything that’s too good to be true probably is.
    0:21:19 Yeah.
    0:21:23 I think then you get into the really subtle challenges, which is that, you know, the algorithm
    0:21:28 tells me this molecule should be non-toxic to a human and should have effect on this,
    0:21:29 you know, indication.
    0:21:31 Do I trust it?
    0:21:34 Is it possible that there’s a false pattern learned there?
    0:21:37 Humans make these types of mistakes all the time, right?
    0:21:42 Like if you have any type of like actual biotech, you know that there’s gonna be molecules
    0:21:44 made or development candidates that are disproven.
    0:21:49 So you’re getting into the hard core of learning, which is that, is this real?
    0:21:51 The reality is we don’t have answers to these.
    0:21:55 They were really kind of trending into the edge of machine learning today, which is that,
    0:21:58 is this a causal mechanism?
    0:21:59 Does A cause B?
    0:22:01 Is it a spurious correlation?
    0:22:04 And now we’re getting to that place where humans aren’t necessarily better.
    0:22:09 We talk about some techniques for interpreting, for looking at kind of what informed the decision
    0:22:11 of the deep learning algorithm.
    0:22:16 And we do provide a few kind of tips and tricks to start thinking about it, but the reality
    0:22:19 is that’s kind of the hard part of machine learning.
    0:22:20 It’s the edge.
    0:22:24 The interpreting chapter is one of my favorite ones because it’s often sort of become so-called
    0:22:29 common wisdom that machine learning is a black box, but in fact, it doesn’t have to be and
    0:22:32 there’s lots of things to do and we are quite prescriptive there.
    0:22:36 So the interpretability I think also is frankly what’s going to make human beings more at
    0:22:38 peace with this.
    0:22:40 And this isn’t anything unique to machine learning.
    0:22:46 If you had some guru who’s just spouting off stuff and said, “Buy this stock X and short
    0:22:51 the stock Y and put all your life savings into it,” you probably would be thinking, “Okay,
    0:22:54 well, maybe, but why?”
    0:22:57 So I think this is just human nature and there’s no reason why our interaction with machines
    0:22:59 would be any different.
    0:23:04 What I think is interesting is human beings are notoriously bad at causality.
    0:23:07 We kind of attribute things to be causal when they’re not causal at all.
    0:23:12 We do that in our lives from why did that person give me that cup of coffee to why did
    0:23:13 that drug fail?
    0:23:15 All these different reasons.
    0:23:17 There’s two big misconceptions about machine learning.
    0:23:18 One is lack of interpretability.
    0:23:22 The second one is correlation doesn’t mean causation, which is true, but somehow people
    0:23:26 take that to mean it’s impossible to compute causality.
    0:23:30 And that’s the part that I think people have to really be educated on because there are
    0:23:33 now numerous theories of causality.
    0:23:36 And you could use probabilistic graphical models, PGMs.
    0:23:38 There’s lots of ways to go after causality.
    0:23:40 The whole trick though is you need time series data.
    0:23:43 What’s beautiful about biology or at least in healthcare is that we’ve got time series
    0:23:45 data in many cases.
    0:23:51 So now perhaps finally there’s the ability to really understand causality in a way that
    0:23:55 human beings couldn’t because we’re so bad at it and machines are good at it and we’ve
    0:23:56 got the data.
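
    As a flavor of what time-series data buys you, here is a hedged sketch of a Granger-causality test ("does the history of X improve predictions of Y?"), one of the simplest formal tools in this family and far weaker than full causal inference; the data below is synthetic:

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = np.roll(x, 2) + 0.5 * rng.normal(size=500)  # y follows x with a 2-step lag

    # Column order is (effect, candidate cause); small p-values at lag 2
    # suggest x's history helps predict y beyond y's own history.
    grangercausalitytests(np.column_stack([y, x]), maxlag=3)
    ```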
    0:24:00 Can you think of a place where in your experience the algorithms have succeeded in teasing out
    0:24:03 a causal structure that people missed?
    0:24:10 Yeah, so I think in healthcare we always think about what is leading to various changes like
    0:24:15 this drug having adverse effects, this diet having positive or negative effects.
    0:24:20 All of these things are being understood in the category of real world evidence, which
    0:24:23 is a big deal in pharma these days.
    0:24:28 And if you think about it, a clinical trial is really a poor man's surrogate for not
    0:24:32 understanding causality because if we don’t understand causality you’ve got to do this
    0:24:36 thing where it’s double blind, we start from scratch and I’m following it in time and we
    0:24:37 see it.
    0:24:42 If you understood causality you might be able to just get a lot of results from just mining
    0:24:43 the data itself.
    0:24:47 As a great example you can’t do clinical trials for all pairs of drugs.
    0:24:51 I mean just doing for a single drug is ridiculously expensive and important, but all pairs of
    0:24:54 drugs would never happen, but people take pairs of drugs all the time.
    0:24:59 And so their adverse effects from real world data is probably the only way to do it and
    0:25:03 we can actually get causality and there’s tons of interesting journal medicine papers
    0:25:07 sort of saying, “Aha, we found this from doing data analyses.”
    0:25:09 I think that’s just starting out.
    0:25:14 Honestly, I think that bio-AI drug discovery needs to take a page from the self-driving
    0:25:19 car companies. In the neighboring self-driving car world, simulators are all the rage.
    0:25:25 And really because it’s that same notion of causality almost, like there’s a structure
    0:25:30 to the world, like pedestrians walk out, chickens, alligators, whatever crazy thing.
    0:25:34 I saw this in a picture; it happens.
    0:25:39 So I think there they’ve built this amazing infrastructure of being able to run these
    0:25:44 repeated experiments, almost a randomized clinical trials, but informed by real data.
    0:25:49 We don’t yet have that infrastructure in bio-world and I know there’s a couple of exciting
    0:25:53 startups are starting to kind of move towards that direction, but I think it’s when we
    0:25:58 can really probe the causality at scale and then in addition to just probing it, when
    0:26:04 the simulator is wrong, use the new data point that came in and have the simulator learn
    0:26:05 to fix itself.
    0:26:09 That’s when you get to this really amazing feedback loop that could really revolutionize
    0:26:10 biology.
    0:26:15 Yeah, so we talked about some basic nuts and bolts about how to get started and the framing
    0:26:17 of questions, which is a key part.
    0:26:22 So let’s say people, they’re set up, they’ve got their question, where do they go from
    0:26:23 there?
    0:26:27 I mean, in a sense, we’re talking about something closer to open source biology and to the extent
    0:26:33 that biology is programmable and synthetic biology is, I think, very much, it’s been around
    0:26:35 for a while, but I think it’s really starting to crest.
    0:26:39 How do these pieces come together such that we could finally get to this sort of open source
    0:26:41 biology democratization of biology?
    0:26:45 A big part of this is really the growth of community.
    0:26:49 There are people behind all these GitHub pages that you see.
    0:26:54 There’s real decentralized, powerful organizations that, if you look at the Linux Foundation,
    0:26:59 if you look at, say, the Bitcoin Core Foundation, there are networks of open source contributors
    0:27:02 really that form this brain trust.
    0:27:03 It’s very diffuse.
    0:27:08 It’s not centralized in the Stanford, Harvard, Med Department, or whatever.
    0:27:11 And I think what we’re going to see is the advent of similar decentralized brain trusts
    0:27:13 in the bio world.
    0:27:17 It’s in a network of experts who are kind of spread across the world and who kind of
    0:27:20 contribute through these code patches.
    0:27:22 And that, I think, is not at all new to the software world.
    0:27:24 We’ve seen that for decades.
    0:27:25 It’s totally new to biology.
    0:27:26 It’s alien.
    0:27:32 Like, you would be surprised how much skepticism there can be at the idea that a non-Harvard
    0:27:35 trained, say, biologist can come up with a deep insight.
    0:27:37 We know that to be a fact, right?
    0:09:43 There are multiple PhDs' worth of work in just, like, the Linux kernel, and that community
    0:09:46 really doesn't care to get that stamp of approval.
    0:27:50 So I think we’re going to see the similar parallel kind of knowledge base that grows
    0:27:51 organically.
    0:27:55 But it takes time because you’re talking about the building of another kind of almost educational
    0:27:57 structure, which is this new and exciting direction.
    0:28:00 Here’s the challenge I worry about the most, which is that, like, so you’re building a
    0:28:05 Linux kernel, you can test whether it works or doesn’t work relatively easily.
    0:28:09 Even as it is, there’s this huge reproducibility crisis in biology.
    0:28:14 So how does one sort of become immune from that, or at least not tainted by that?
    0:28:15 How do you know what to trust?
    0:28:18 And this is a really, really interesting question.
    0:28:23 And this is kind of shading a little bit almost into the crypto world, right?
    0:28:27 Like you could potentially think about this experiment where you have like a molecule.
    0:28:31 You don’t know what’s going to happen to it, but maybe you create a prediction market
    0:28:34 that talks about the future of the small molecule.
    0:28:38 And you could then begin to create these historical records of predictions.
    0:28:42 And we all know there are kind of like expert drug pickers at Big Pharma who can like eyeball
    0:28:45 and say that is going through, that is failing.
    0:28:48 And five years later, you’re like, well, shit, okay, yes, I was right.
    0:28:52 There is the beginnings of infrastructure for these feedback mechanisms, but it’s a really
    0:28:53 hard problem.
    0:28:54 Yeah.
    0:28:55 I’m trying to think though what that would be like.
    0:28:59 The huge thing is like, you could imagine if it was a simple question, like, is this
    0:29:01 drug soluble?
    0:29:03 Someone might run a cheap software calculation.
    0:29:07 Someone might do the experiment and there’s different levels of cost of having different
    0:29:09 levels of certainty.
    0:29:14 You’re essentially describing a decentralized research facility.
    0:29:16 Maybe the question is who would use it?
    0:29:21 This is, I think, the really hard part because I think that biopharma tends to be a little
    0:29:24 more risk averse for good reasons than many other industries.
    0:29:29 But I actually think that in the long run, this could be really interesting because if
    0:29:34 you have multiple assets in a company, you could kind of like, disbundle the assets and
    0:29:39 then you could start to get this much more granular understanding of like, what assets
    0:29:41 actually do well, what assets don’t.
    0:29:47 And if you make it okay for people to like, place a bet on these assets, all of a sudden
    0:29:53 it’s de-risk because if you’re a big farmer and you’re like, I don’t really believe that
    0:30:00 Alzheimer’s molecule does what is claimed, but I’m going to say like 15% odds it goes
    0:30:01 through.
    0:30:04 I’ll just invest 15% of what I would have in another world.
    0:30:07 The trick is, and especially what we’re talking about now is the world of financial instruments
    0:30:11 as well, is the trick is you have to be able to know how to risk an asset.
    0:30:15 And so it could be in the end, one of the first interesting applications of deep learning,
    0:30:20 machine learning is to use all the available data to give the maximum likelihood estimator
    0:30:22 of what we think this asset is going to be.
    0:30:25 It prices the asset and then people can go from there.
    0:30:29 It’s kind of a fun world where we’re sort of thinking about how the financial world,
    0:30:33 machine learning world and biology come together to kind of decentralize it and democratize
    0:30:34 it.
    0:30:40 I think there’s opportunities to kind of like, allow for more risks, the long tail to be played
    0:30:41 out.
    0:30:45 You don’t have as many interesting hypotheses that grow dead in the water because it wasn’t
    0:30:48 de-risk enough for a big bet.
    0:30:53 So, you know, what I think the big takeaway for me here is that there is that possible
    0:30:56 world, but I forget if this is the way you learned how to program.
    0:31:03 The way many of us did is I learned when I was like 11 on, like, actually a TI-99/4A and
    0:31:07 I was just playing around with it and I learned so much because it was, I could just get my
    0:31:09 hands right in it.
    0:31:13 And I think kind of, my hope for the book is that it’s kind of the equivalent in biology
    0:31:16 that people can get their hands in it and I don’t know where they’re going to go with
    0:31:17 it.
    0:31:19 I don’t know if they go where we’re describing.
    0:31:22 That’s one of the possible, any futures, but I think that’s what we’re hopefully being
    0:31:23 able to give people.
    0:31:25 We are opening up the sandbox.
    0:31:30 Here’s what we’ve learned in kind of these very exclusive academic institutions.
    0:31:36 Let’s throw the gate open, say here’s as much as we know as we can try to distill it down
    0:31:38 and do what you will with it.
    0:31:42 Like open source means no permission, so go to town and hopefully do something good for
    0:31:44 the world is kind of the dream.
    0:31:45 That sounds fantastic.
    0:31:46 Well, thank you so much for joining us.
    0:31:47 Thank you for having me.

    with Vijay Pande (@vijaypande) and Bharath Ramsundar

    Deep learning has arrived in the life sciences: every week, it seems, a new published study comes out… with code on top. In this episode, a16z General Partner Vijay Pande and Bharath Ramsundar talk about how AI/ML is unlocking the field in a new way, in a conversation around their book, Deep Learning for the Life Sciences: Applying Deep Learning to Genomics, Microscopy, Drug Discovery, and More (also co-authored with Peter Eastman and Patrick Walters).

    So — why now? ML is old, bio is certainly old. What is it about deep learning’s evolution that is allowing it to finally make a major impact in the life sciences? What is the practical toolkit you need, the right kinds of problems to attack, and the right questions to ask? How is the hacker ethos coming to the world of biology? And what might “open source biology” look like in the future?

  • 338: What I’ve Learned and Applied from 49 Awesome Entrepreneurs – Part 6

    At the end of nearly every episode of The Side Hustle Show, I ask my guests for their #1 tip for Side Hustle Nation. There’s always a great variety of responses, and I wanted to take some time today to go through some of my favorites from the past 50-ish interviews.

    This has become an annual tradition on the show, and we just passed 6 years and 8.5 million downloads!

    If you like this short-and-sweet meta-style show, be sure to check out the others in this series:

    And even though my primary motivation is to extract helpful tactics for you, the listener, I can’t help but learn from my guests as well. You never know when inspiration will strike or where you’ll hear the one insight that has a huge impact.

    These episodes are a lot of fun to put together, and give me an excuse to revisit some of my favorite moments and wise words from the show.

    From the last 49 guests, the most common theme I heard was to “Take action. Just start!”

    While it might sound overly generic, don’t be quick to dismiss it. If all these really smart and successful people keep saying this episode after episode as their “#1 tip,” I think it’s worth paying attention to.

    My #1 tip this time? Surround yourself with people on the same trajectory.

    Full Show Notes: What I’ve Learned and Applied from 49 Awesome Entrepreneurs – Part 6

  • 380. Notes From an Imperfect Paradise

    Recorded live in Los Angeles. Guests include Mayor Eric Garcetti, the “Earthquake Lady,” the head of the Port of L.A., and a scientist with NASA’s Planetary Protection team. With co-host Angela Duckworth, fact-checker Mike Maughan, and the worldwide debut of Luis Guerra and the Freakonomics Radio Orchestra.

  • a16z Podcast: The Power of Restorative Justice

    AI transcript
    0:00:04 I’m Chris Lyons, and I lead the Cultural Leadership Fund here at Andreessen Horowitz,
    0:00:08 a strategic investment vehicle that connects the world’s greatest cultural leaders to
    0:00:10 the best new technology companies.
    0:00:15 This segment of the A16Z podcast was based on an event hosted by the CLF in which we
    0:00:20 featured a special early screening of Van Jones’s new series, The Redemption Project, followed
    0:00:25 by a fireside chat between Van Jones and Chaka Senghor.
    0:00:29 The Redemption Project is an eight-part series that looks at victims’ families of a life-altering
    0:00:34 crime as they come together to actually meet their offender in hopes of finding personal
    0:00:35 healing or peace.
    0:00:40 It’s a rare glimpse into the U.S. prison system and also the incredible human potential
    0:00:43 for redemption through restorative justice.
    0:00:47 In this episode, Jones brought together a police officer who was shot and the man who
    0:00:51 committed the crime decades earlier when he was only 17 years old.
    0:00:54 In addition to the conversation between Van and Chaka, you’ll also hear two spoken
    0:00:56 word performances.
    0:01:00 Both artists are formerly incarcerated individuals who have contributed to The Beat Within, an
    0:01:05 organization and publication that serves over 5,000 youth annually through workshops
    0:01:11 operated across California County juvenile halls and encourages literacy, self-expression,
    0:01:15 healthy and supportive relationships with adults from their community.
    0:01:19 First off, we’ll open up with Kevin Gentry performing his piece, My Heart.
    0:01:24 And please note, there is some profanity and mature material in this episode.
    0:01:32 For all intents and purposes, this piece, I loosely call it a piece, it’s more a letter
    0:01:38 and the recipients of which are going to become readily apparent as I read this.
    0:01:43 Excuse me, I’m sorry, may I please have just a few minutes of your time to say how much
    0:01:47 I’m sorry for destroying your life.
    0:01:51 Strong words that fall so short I can only imagine.
    0:01:56 How can I, especially I, even begin to measure the impact of what I’ve done.
    0:02:02 The loss, the pain, the emptiness, the sorrow, the guilt, what ifs, if onlys.
    0:02:03 Is that a good start?
    0:02:05 Maybe, I don’t know.
    0:02:11 For so long I have dreamt of just how, what to say, the right words, but everything just
    0:02:13 feels so flat.
    0:02:18 So now here I am, resigned to having faith in the process, releasing my heart to you
    0:02:23 through the words, praying that they will do, sparing even the slightest amount of any
    0:02:25 additional hurt.
    0:02:31 In no way did you deserve these years of torment, the anguish, the pain, the emptiness, perhaps
    0:02:37 even bearing the burden of having to be strong for others when support was the furthest thing
    0:02:39 from your mind.
    0:02:43 You didn’t deserve such a fate, I’m sorry.
    0:02:48 Sorry that on that fateful day I largely treated others like I felt.
    0:02:55 Empty and devoid of any value, I saw your loved one as an object, though human, an obstacle
    0:02:57 to my hopes and dreams.
    0:03:03 Hopes and dreams are belonging and feeling relevant in the eyes of others, relevant so
    0:03:10 unattainable it seemed for so long, so empty such a void I felt barren to the core.
    0:03:14 My attempts to self heal I thought while I was perfecting.
    0:07:21 If I get more I’ll be more, value was in the “more,” irrelevance was in the “not.”
    0:07:29 Ingenious, I believed back then, feel bad, fill with stuff, feel good, but not for long.
    0:03:32 Try again, something’s wrong.
    0:03:40 The pattern I repeated a revolving door in my life, try to feel, feeling full, just temporary,
    0:03:43 once again feeling empty setting in.
    0:07:50 The rightful expectation that a life, his life, our lives should be unrestrained and unimpeded
    0:03:55 by the untrue self defeating and outwardly destructive thoughts and behavior of someone
    0:03:57 just as me.
    0:04:04 To stand in the way with an idea, a belief in some time, to cowardly step with hollow purpose
    0:04:07 to fill a void that was never real.
    0:04:13 Your loved ones so deserving of everything good, unaffected by me, unfortunately there
    0:04:15 wasn’t me.
    0:04:20 But thank God there is also you through which his life still lives.
    0:04:26 Through the memories and lessons in love, the affection and joy and promise and hope
    0:04:31 and countless other memories I’m sure, though I cut them way too short.
    0:04:37 Now illuminated to the precious sanctity of life, the gift of the beauty and purpose that
    0:04:44 lies within us all, staying ever mindful that I will never grasp the gravity of the destruction
    0:04:46 I caused you that day.
    0:04:55 I stay primed and fueled to walk boldly, purposefully, into any and every venue to answer my call.
    0:05:00 To carry his memory in my heart to others with a message of life, of promise even on
    0:05:03 the lowest rung to all.
    0:05:09 Hope is eternal, believe it, a bright future can spring from even the darkest past.
    0:05:16 The words that I now utter, I do so to breathe life into those who may feel that they have
    0:05:24 gasped their last.
    0:05:26 And now we’ll hear from Van Jones and Shaka Sengor.
    0:05:31 Shaka was most recently the executive director of the Anti-Recidivism Coalition, a New York
    0:05:36 Times bestselling author for his memoir Writing My Wrongs: Life, Death, and Redemption in an
    0:05:41 American Prison, and star of the highly anticipated One Man Show.
    0:05:46 Van Jones is an American news commentator, author, co-founder of several non-profit organizations
    0:05:52 including Reform and Yes We Code, host of The Van Jones Show and co-host of CNN’s political
    0:05:54 debate show Crossfire.
    0:05:58 Their conversation is all about the redemption project, the American prison system and how
    0:06:03 we can normalize rehabilitation and restorative justice in our culture.
    0:06:09 The journey toward redemption is one I understand on a very personal level.
    0:06:14 And you and I, we’ve been friends for a while and we’ve had a chance to talk about, you
    0:06:18 know, what does redemption look like for people?
    0:06:24 What is something that you would say really stood out to you as a lesson that we can all
    0:06:30 take away to create space for redemption to happen?
    0:06:35 Doing this whole series has changed me in ways I haven’t really caught up to yet.
    0:06:42 You know, now when I’m on TV and we’re supposed to be tearing each other up over some tweet
    0:06:49 or some other nonsense that’s going on, which is terrible stuff, but I have a hard time
    0:06:58 getting as petty and shitty as you have to be to do good television.
    0:07:04 And it’s jeopardizing my career, I have to figure out some way to get petty again.
    0:07:08 I have some answers for you.
    0:07:13 That one was the hardest one for me to do because my dad used to be a cop.
    0:07:20 And my uncle Milton just retired from Memphis City Police Force a couple of years ago.
    0:07:25 And so that one was hard for me as much as I do criminal justice stuff and as much as
    0:07:29 I’ve like, you know, been against police brutality.
    0:07:33 That’s always your fear when you have a family member who’s a cop.
    0:07:41 And you can see me struggling in this episode to be my usual sort of like open self.
    0:07:42 Like I was really tight.
    0:07:47 You know, I was really trying, but I wasn’t succeeding in this episode.
    0:07:52 And I told Jason, I said, I don’t think this is going to go well.
    0:07:58 Tom has admitted that he’s got racial bias, which was a big deal.
    0:08:01 You know, this is not going to go well.
    0:08:04 This is going to be a shit show.
    0:08:08 And I guess one has to go terribly, like that was basically my view.
    0:08:13 And so I didn’t have any hope in that one.
    0:08:17 I was just waiting for him to come out and, you know, say some stuff that wasn’t going
    0:08:18 to work.
    0:08:26 And as soon as the door opened, just something changed, both of them became something different
    0:08:30 than they had been up until the moment they saw each other.
    0:08:36 Something fell away and, you know, between men, there’s almost always some shielding
    0:08:43 in a patriarchal society, like you’ll tell a woman you just met more than you’ll tell
    0:08:46 your homeboy of 20 years, you know, about how you actually feel.
    0:08:49 You know, it’s just the trap.
    0:08:54 And between white and black people, there’s always a lot of gulf.
    0:09:03 And between cops and black people, it’s like planetary levels of gulf and all just disappeared.
    0:09:08 And you saw these two guys who had literally tried to kill each other, laugh at each other,
    0:09:14 saw each other, have this conversation that I bet they couldn’t have with any other human
    0:09:16 being.
    0:09:18 And I haven’t processed it.
    0:09:22 And there’s a lot of stuff in this series I haven’t processed.
    0:09:23 Yeah, I can imagine.
    0:09:26 I struggle with this episode.
    0:09:31 You know, I’ve watched a few episodes, I’ve actually struggled with all of them.
    0:09:38 And you know, for those who may not know my story, I was convicted of second degree homicide.
    0:09:45 And while I was in prison, I got into an altercation and I punched the officer in the neck and
    0:09:47 almost killed him.
    0:09:54 The family of the man whose life I’m responsible for taking, one of them reached out to me
    0:09:58 and extended a letter of forgiveness during my incarceration.
    0:10:04 The officer that I got into the conflict with in prison advocated for me to die in solitary
    0:10:05 confinement.
    0:10:12 And so as I’ve done this work over the years, that’s one of the areas of my life I haven’t
    0:10:14 been able to reconcile.
    0:10:21 So watching Jason come out and seeing that through the lens of his 17-year-old self,
    0:10:26 and knowing where he was back then, and knowing that I was him back then.
    0:10:33 And I’m thinking about this larger conversation that this is presenting to the world about
    0:10:35 how do we see what’s possible.
    0:10:41 You know, I’ve been out of prison almost nine years now, I’ve been highly successful
    0:10:46 and been able to do a lot of work in this space and prevent acts of violence in communities
    0:10:48 throughout the country.
    0:20:56 But the reality is, for many men like Jason, like myself, society just says, “Wash our
    0:20:57 hands of them.
    0:10:58 They’re broken.
    0:10:59 They’re beyond repair.
    0:11:00 Throw them away.
    0:11:02 Let them die in prison.”
    0:11:08 And one of the things that really struck me was that restorative justice gives space for
    0:11:13 people who have been hurt by the Jason’s of the world to have their say.
    0:11:17 And we saw what happens when you create space for that.
    0:11:22 You know, Tom’s a remarkable man, Christie is an extraordinary woman.
    0:22:27 And the courage that they exhibited was honest, you know, she went from, you know, “I want
    0:22:35 them to die in prison because we can’t kill them because of that particular crime” to forgiveness.
    0:11:41 And so as we think about this show, how do we amplify that part of the message?
    0:11:46 How do we get people to understand that people do change in a very real way?
    0:11:54 Well, look, I mean, part of what’s crazy about this show is that it exists at all.
    0:12:00 You know, CNN has put this at nine o’clock on Sundays, which is prime time.
    0:12:03 And that’s Anthony Bourdain’s slot.
    0:12:06 Against Game of Thrones now.
    0:12:13 So they either really like it or they really don’t.
    0:12:23 Our idea was we wanted to do media that would be healing, that would be positive, that would
    0:12:29 be transformative and, you know, living in Hollywood and all that, you know, you get
    0:12:35 a lot of side-eye looks at you when you talk that way, as you know, until you actually
    0:12:40 can produce something that makes the point, you’re just one of those people talking in
    0:12:45 the cafe that everybody rolls their eyes at, which is half the population of LA.
    0:12:51 Luckily, Jana’s best friend from college, Antonia, is married to a guy named Jason Cohen.
    0:12:58 Jason Cohen is the guy that did “Facing Fear,” that Oscar-nominated film about a former U.S.
    0:13:02 neo-Nazi who reconciled with the victim of his violence.
    0:13:08 So Jason, having done that film, said, “Hey, let’s do this.
    0:13:10 Let’s do this kind of a series.”
    0:13:15 So we just went totally renegade, you know, CNN, I’m not allowed to do anything without
    0:13:19 their permission on camera, but we just went totally renegade, shot something.
    0:13:21 It wasn’t a good idea.
    0:13:23 Let me stop you there, right?
    0:13:30 So you basically what you’re saying is that you are willing to compromise your career.
    0:13:34 You’re risking something you’ve worked long and hard for.
    0:13:38 Most people would, you know, who talk a lot, especially people on social media, they would
    0:13:43 love to be on CNN sharing their opinions and views and thoughts.
    0:13:51 And you were willing to sacrifice or compromise that because you felt so strongly about the
    0:13:53 importance of this mission.
    0:13:59 Yeah, but yeah, because who gives a shit if we’re going to just be up here, I mean, you’re
    0:14:00 the same way.
    0:14:03 I mean, you could, people in this room are the same way.
    0:14:04 Look.
    0:14:06 I might not quit my job.
    0:14:08 And you’re about to quit.
    0:14:11 I got a seven-year-old.
    0:14:14 But honestly, like that’s how we got the messy truth on the air.
    0:14:18 I think it’s a very important point is that we have to take chances.
    0:14:23 I mean, for me, I felt like this is the moment.
    0:14:28 I feel like criminal justice reform is finally becoming a mainstream conversation.
    0:14:33 The problem that we have right now is that there’s a level that people won’t go to.
    0:14:36 So we can have the conversation about innocence, right?
    0:14:40 And that’s an important conversation because that begins to chip away at people’s confidence
    0:14:43 in the system that innocent people are being put away in prison.
    0:14:48 So that used to be risky to say that our system, our American system, is putting people to
    0:14:49 death who are innocent.
    0:14:50 That was radical.
    0:14:52 But we’ve been able to establish that.
    0:14:55 Then we went to the nonviolent drug offenders.
    0:14:57 They’re guilty, but they’re guilty of stuff that you did in college.
    0:14:59 So why are they in prison?
    0:15:00 Or maybe you did this weekend.
    0:15:05 So don’t raise your hand.
    0:15:08 And so now that’s been established.
    0:15:13 But then the danger is that, well, okay, but if you’re not innocent and
    0:15:17 if you’re not nonviolent, well, then we really don’t have to care about you at all.
    0:15:22 And we have all these funerals in the community and we have all this harm and we can’t talk
    0:15:23 about it.
    0:15:30 And I said, this true crime genre has to be hacked and used for something positive because
    0:15:34 true crime on the left wing, it’s about exoneration.
    0:15:35 Like who done it?
    0:15:38 Well, we’ve got to exonerate the person because they’re actually innocent.
    0:15:41 Or on the right wing, it’s catch a killer.
    0:15:45 But true crime as a who done it genre doesn’t get to the truth because a lot of times we
    0:15:47 know who did it.
    0:15:48 We already know who did it.
    0:15:53 It’s about the truth long after the crime, which is that growth is possible for people
    0:15:55 who have done harm.
    0:16:00 And healing is sometimes impossible for people who’ve been harmed because of separation.
    0:16:04 Because we don’t let people actually eventually come back together.
    0:16:06 And so I say it was worth the risk.
    0:16:07 And so we did it.
    0:16:08 It was a little bit nuts.
    0:16:10 We showed it to CNN.
    0:16:15 Look, that day when we did the first one, I literally, I cried so hard when it was over
    0:16:17 that my nose started bleeding.
    0:16:21 Like, because my blood pressure was so high, it was just such an intense thing to see a
    0:16:24 man who had killed someone’s mother sit down with the daughter 20 years later and try to
    0:16:26 explain.
    0:16:29 And we showed that to CNN.
    0:16:35 And at that point, you know, we had no other people to go talk to.
    0:16:38 It wasn’t like there’s thousands of people for us to go talk to, but CNN said if you
    0:16:40 can find more, shoot it.
    0:16:41 So we shot it.
    0:16:42 Why am I saying all this?
    0:16:47 I’m saying this to say that from my point of view, we’re at a point where those of us
    0:16:53 who have privilege earned or otherwise, those of us who have positions of power, those of
    0:16:59 us who have positions where, you know, people have to listen to what we say.
    0:17:00 We have to push.
    0:17:03 Phaedra Ellis-Lamkins is here.
    0:17:08 And she’s an African-American entrepreneur in the tech space, female.
    0:17:13 You know, they say, like, that’s like a plaid unicorn or something, like, you know, it’s
    0:17:16 not even supposed to exist in fantasy land.
    0:17:22 And yet she’s building a company called Promise, pushing technology to solve some of these problems
    0:17:25 in the community and winning, right?
    0:17:27 You know, she doesn’t have to do that.
    0:17:31 She could have taken an easy job and not try or put together a company to, like, you know,
    0:17:36 make, you know, I don’t know, pictures or something, I don’t know.
    0:17:40 But she’s doing the hard thing the hard way for the right reasons.
    0:17:43 So all I’m saying is this.
    0:17:47 This is not a show about criminal justice, first of all.
    0:17:51 We have to market it that way and promote it that way, but it’s not about that.
    0:17:52 It’s about humanity.
    0:17:58 All of us have done something that we profoundly regret and don’t have any way to apologize
    0:17:59 for.
    0:18:03 All of us have had something done to us that’s hard to get past, and the stakes are higher
    0:18:06 in our show, but this is humanity.
    0:18:08 This is the human condition.
    0:18:13 And yet in our culture, empathy is no longer trendy.
    0:18:15 Compassion is no longer trendy.
    0:18:19 It’s about the cancel culture, the call-out culture, and it’s poison.
    0:18:21 This is the human condition.
    0:18:26 We have to be able to listen to each other, to forgive each other, to hold each other,
    0:18:27 to help each other.
    0:18:28 That’s not fashionable.
    0:18:31 And so we want to put some medicine back in the culture.
    0:18:35 This show is our attempt to put some medicine back in the culture, and a little bit of medicine
    0:18:37 can go a long way.
    0:18:42 And so, you know, that’s what we’re trying to do.
    0:18:43 [APPLAUSE]
    0:18:47 I really want to push the envelope a little bit.
    0:18:48 Eight episodes.
    0:18:49 Yes, sir.
    0:18:52 In one of the episodes, we see restorative justice happening, right?
    0:18:55 In small pockets throughout the country, some prisons are a lot more progressive with
    0:18:58 creating space for that.
    0:19:02 But the reality is, it doesn’t happen for everybody.
    0:19:07 So a lot of men, I work with men and women every day who come home from prison.
    0:19:14 As executive director of the Anti-Recidivism Coalition, our staff is comprised of 54% system-impacted
    0:19:16 men who have come out of prison.
    0:19:19 A lot of them have armed robberies.
    0:19:20 Homicides.
    0:19:21 Attempted murder.
    0:19:26 I have scores of friends who are coming home after the “War on Drugs” campaign.
    0:19:33 Thousands of men and women come home every day who have served 15, 20, 30 years in prison.
    0:19:38 They haven’t gone through a restorative justice process, because for years, our prison system
    0:19:41 was designed for nothing more than punishment.
    0:19:44 And that’s coming from somebody who was deeply immersed in that environment.
    0:19:46 And I know the type of work it takes to get there, right?
    0:19:50 I know what it takes to transform a life.
    0:19:54 I can honestly say I was super blessed and fortunate, because I was actually literate
    0:19:55 when I went to prison.
    0:19:57 And so I was able to read books that inspired me.
    0:20:03 I was able to read Malcolm and read Mandela and read books about personal transformation
    0:20:05 in these things, right?
    0:20:06 And then I put the work in.
    0:20:09 That’s not the norm in prison.
    0:20:12 This is not the norm in prisons throughout the country.
    0:20:21 And so one of the things that I’m always thoughtful about is how do we normalize restorative justice?
    0:20:24 How do we normalize redemption?
    0:20:26 You watch somebody in their worst moment.
    0:20:30 It’s one of the things that I love that Sheriff Tom spoke about is that he met him in his
    0:20:31 worst moment.
    0:20:33 He met Jason in his worst moment, right?
    0:20:36 But he was also in his worst moment.
    0:20:40 And now we have many men and women coming home, and I deal with them all the time, and they’re
    0:20:44 broken and they haven’t been able to make peace.
    0:20:48 We think about the victim and them working through their trauma.
    0:20:52 But there’s also work that those of us who have perpetrated a violent crime have to do
    0:20:55 on our own.
    0:21:00 And when I say on our own, oftentimes on our own, because in most cases we’re scary.
    0:21:02 People are afraid.
    0:21:04 You’ve killed another human being.
    0:21:09 I don’t know if I can trust when you’re upset or when you’re angry or when things aren’t
    0:21:13 going your way that you won’t react in that manner again.
    0:21:20 So how do we create a space where there’s more honesty about what’s really not working?
    0:21:23 We know about the policies and things like that, right?
    0:21:29 But once the policies work, there’s real human beings coming home with deep, deep trauma.
    0:21:35 My first 10 years in prison, I was in solitary my second year, and I ended up in solitary
    0:21:38 my seventh year that extended to my 11th year.
    0:21:43 So I did a total of seven years in hell, and I’m fortunate to have that breakthrough.
    0:21:50 But what about the men and women who don’t have space to reconcile their past?
    0:21:52 And what is our responsibility?
    0:21:57 Ultimately, I guess the question is, what is our societal responsibility when it comes
    0:22:02 to welcoming those men and women home in a healthy way?
    0:22:06 I think this is the key question for American society.
    0:22:10 I don’t have an answer, but it’s the key question.
    0:22:11 We have…
    0:22:17 People have become almost numb to the numbers, but we have the biggest incarceration
    0:22:23 industry in the world here in the United States, trafficking in human flesh, trafficking in
    0:22:25 human bodies.
    0:23:28 On the stock exchange, you have private prison companies that get more money the more people
    0:23:32 are locked up, and there’s no business model in de-incarceration.
    0:22:35 The business model is in incarceration.
    0:22:40 But what I do know is this, this is a political problem, kind of.
    0:22:42 It’s a policy problem, kind of.
    0:22:45 It’s an economic problem, kind of.
    0:22:48 It’s a spiritual problem, for sure.
    0:22:49 Absolutely.
    0:22:55 It’s a spiritual problem, and separation is the enemy.
    0:22:56 That’s the problem.
    0:23:02 And unfortunately, you have now both political parties preaching separation and superiority.
    0:23:08 Those red state people, those bigots, those idiots, those Trump voters, they’re terrible.
    0:23:13 It’s almost like we in the blue, we’re good, they’re bad.
    0:23:15 And it’s almost like a colonial thing.
    0:23:22 Like the people in the red state, these unwashed heathens that need to be conquered and converted
    0:23:31 to the NPR religion, and force-fed some kale until they can rise up to our level of
    0:23:32 civilization.
    0:23:34 I mean, this is how people talk.
    0:23:38 Separation and superiority, and then, of course, you know how the other side does.
    0:23:40 And so, for me, it’s a spiritual problem.
    0:23:42 Separation is the enemy.
    0:23:49 And so, I have discovered all these diamonds behind those prison walls.
    0:23:50 Absolutely.
    0:23:52 No pressure, no diamonds.
    0:23:53 There are diamonds behind those walls.
    0:24:00 There are people behind those walls that are much wiser, much braver, much stronger, much
    0:24:06 more creative than 99.99% of people who are on the outside.
    0:24:12 When I worked in the Obama White House on a Friday, I was at San Quentin doing my work.
    0:24:17 And then on Monday, I was in the Obama White House reporting for work.
    0:24:20 So I went from the jailhouse to the White House in 72 hours.
    0:24:26 And even under the Obama administration, the smartest people in the Obama administration
    0:24:29 were no smarter than the smartest people at San Quentin.
    0:24:35 But the wisest people at San Quentin were wiser than anybody in Washington, D.C.
    0:24:39 All I know is that I have to tell the truth as I see it.
    0:24:40 Absolutely.
    0:24:45 And part of it is, you know, telling people, “Look, I went to Yale Law School.
    0:24:50 I saw more kids doing drugs at Yale than I ever saw doing drugs in housing projects.”
    0:24:51 Period.
    0:24:54 And none of those kids even saw a police officer.
    0:24:58 When they got in trouble, they went to rehab or France.
    0:25:03 They sure didn’t go to prison.
    0:25:08 And yet four or five blocks away, those kids, you know, doing fewer drugs because they had
    0:25:12 less money and selling fewer drugs because they were dealing with a different clientele,
    0:25:15 they almost all at least got arrested if they didn’t go to prison.
    0:25:19 And yet now we sit here and say, “Well, I can’t, my God, I can’t hire you.
    0:25:20 You’re a drug felon.”
    0:25:24 You know what I mean?
    0:25:30 So the hypocrisy of a society where almost everybody’s addicted to something and nobody
    0:25:35 can survive, think about this, these phones we carry around, if I told you right now that
    0:25:41 for the past three months we have been audio taping and video taping, everything you’ve
    0:25:49 been doing, and we’re now about to show it on the screen, you would run out of here because
    0:25:52 none of us are as good all the time as we’re supposed to be.
    0:25:53 Absolutely.
    0:25:57 And nobody wants to be defined by their worst moment or their worst mistake, as you said
    0:25:58 many times.
    0:26:03 And so for me, I don’t know, but I do know that everybody in here has a lot of power
    0:26:04 in the matter.
    0:26:09 And everybody in here has a lot of ability to turn it.
    0:26:12 And I think it’s trying to happen.
    0:26:17 I think the fact that this many people are here, the fact that CNN put this up, I think
    0:26:19 it’s trying to happen.
    0:26:25 You know, your voice, Topeka Sam’s voice, Lewis Reed’s voice, the voice of people who
    0:26:30 are directly impacted, people who are coming out of prison, you’re right, everybody doesn’t
    0:26:33 come out of prison as whole as you.
    0:26:36 Everybody doesn’t come out of prison and have Oprah as their best friend.
    0:26:40 In fact, most people who haven’t gone to prison don’t have those things.
    0:26:44 So art as a tool to shift culture.
    0:26:52 How important is art and technology towards shifting this larger idea culturally?
    0:26:58 You know, the opposite of humanization is criminalization.
    0:27:01 If you can criminalize a whole population of people, like everybody in the neighborhood
    0:27:05 is bad, all the people from that racial group are bad.
    0:27:10 If you can criminalize a whole population, then you dehumanize them and then anything
    0:27:14 can be done, and people won’t respond to it like they would if it’s my child.
    0:27:20 Everybody says, oh my God, my child’s on drugs. Give him 17 years in prison?
    0:27:21 Nobody says that.
    0:27:22 People say my child needs help.
    0:27:28 And so what I would say is that the opposite of criminalization, though, is humanization.
    0:27:32 And so art and technology, which helps us to humanize and spread these stories is really
    0:27:33 critical.
    0:27:34 Thank you.
    0:27:40 Hey, Shaka, we love y’all.
    0:27:41 All right.
    0:27:46 A round of applause for Shaka and Van.
    0:27:51 Now we’ll enjoy a performance by Missy Hart, who will share an amazingly powerful piece
    0:27:55 called “Bloom,” a trilogy, and the titles of the three different poems are “Just Us,”
    0:28:02 “The Dream,” and “What’s Your Seed?”
    0:28:06 Before I share these pieces, I want to share a big part of myself.
    0:28:09 And I feel it’s really important to really paint a picture of the power of healing and
    0:28:11 redemption, and creative art therapies.
    0:28:15 I’m from Norfolk, Redwood City, California, it’s not too far from here.
    0:28:18 My beautiful struggle began when my father committed suicide before I was two.
    0:28:21 So I was raised by my strong single mother, who had to work multiple jobs.
    0:28:26 She came up out of the gang culture as well, and not just working jobs, but taking care
    0:28:27 of my grandmother, who was mentally ill.
    0:28:29 But most of the time it was me and my brother taking care of her.
    0:28:31 So I had to grow up really fast.
    0:28:37 And during that time, growing up in the streets and trying to find my identity, we all go
    0:28:41 through those times trying to find our identity, and being biracial, and a lesbian growing up
    0:28:45 in the late ’90s, early 2000s, I tried to find my place, you know, and I found my place
    0:28:46 in the streets.
    0:28:50 And I started gang banging when I was 10, and being a girl smaller than everyone else,
    0:28:51 I had to go hard.
    0:28:55 In the streets, you’re all in, or you’re not, you’re not going to survive.
    0:28:59 So I was fully committed, went all in, caught my first case when I was 11, when they had just
    0:29:03 passed Prop 21, and then I went into the system.
    0:29:06 When I was 13, I started writing for The Beat. And The Beat really gave me a voice, gave me
    0:29:09 a way to express my truths in my way.
    0:29:13 Because going through the system, you’re constantly trying to go through all these therapies
    0:29:17 and stuff, but you don’t even have language growing up and not being shown what you’re
    0:29:21 feeling, or you just learn to speak with the language of violence and aggression.
    0:29:24 And that’s what I learned to speak.
    0:29:28 So over the next few years, I was in and out of the system, I became a ward of the court,
    0:29:32 so I was in group homes, being locked up, and then being on the run, and then just in
    0:29:33 this constant cycle.
    0:29:38 And it wasn’t until I got released two months before my 18th birthday, and my mom’s boyfriend
    0:29:41 didn’t want me at the house, so I was homeless, serving crack on Army Block.
    0:29:46 I don’t know if y’all from the city, but I was on the blade on 2/6, and then I started changing
    0:29:47 my life.
    0:29:50 And then when I caught attempted murder charges the day before my 18th birthday, I fought the
    0:29:52 whole case in solitary.
    0:29:57 But by the grace of God, I was arrested when I was, because back where I lived, back
    0:30:00 home, my boy ended up stabbing a dude to death not even an hour later.
    0:30:03 So if I didn’t get arrested when I did, I would be in there for murder.
    0:30:05 He’s doing 25 with an L right now.
    0:30:09 And I just really, you know, just started to see that my chances were running out.
    0:30:12 And I got out, and you don’t change overnight, it’s a process, you know, and putting that
    0:30:13 work in.
    0:30:16 But it’s so important.
    0:30:19 And I got out, you know, in and out of county, but then, you know, I started to change my
    0:30:23 life and really see that education was the way to liberate myself.
    0:30:26 So I went back to adult school, got my high school diploma.
    0:30:30 And then I went to community college, when Jason was saying, like, you know, just having
    0:30:32 someone believe in you, that is so powerful.
    0:30:35 It may seem so little, but just even in times when you don’t believe in yourself and you’re
    0:30:39 just raised and taught in a system that breaks your identity down to nothing, you
    0:30:42 know, and The Beat really gave us our voice, and The Beat really planted that seed for me
    0:30:46 because now I’m doing, like, all these amazing things that I couldn’t even imagine back then.
    0:30:50 So I went to community college, ended up winning a full ride to UC Santa Cruz, where I attend
    0:30:52 now, studying psychology and the history of consciousness.
    0:30:53 Thank you.
    0:30:54 Right on.
    0:31:03 And I also just won a national scholarship to go study abroad this fall where I’m going
    0:31:07 to study psychology, neuroscience, come back, go to DC, do an internship, come back, and
    0:31:11 then I’m planning to get my PhD in positive psychology and my end goal.
    0:31:12 Thank you.
    0:31:16 Thank you, thank you.
    0:31:19 My end goal is to open a group home with an art therapy program, because creative
    0:31:23 art therapy is like, it’s so powerful, I can’t stress it enough, like, there’s no words
    0:31:28 that I can even put to it, express, to explain how powerful it is.
    0:31:31 So yeah, without further ado, I’m just going to spray my pieces then.
    0:31:35 So thank you, and I just want to say thank you because this is a privilege of me being
    0:31:39 on the stage because a lot of my loved ones and people I don’t even know, you know, we
    0:31:43 lost to the streets and the system, they don’t get the same opportunity, and I’m just thankful
    0:31:46 and I don’t just do this for myself, but I do this for my people and everyone’s still
    0:31:50 behind those walls and who are lost, you know, and I’m just, I got this motto, it’s called
    0:31:51 be the change, lead the way.
    0:31:57 Alright, so this is “Bloom,” so this first one’s called “Just Us,” a little spin on justice.
    0:32:00 Is it just us who see no justice and no peace struggling to achieve the American dream?
    0:32:05 A dream just to have an opportunity to succeed, but somehow it’s so far to reach.
    0:32:08 Caught in the system that’s designed to keep us at war, at war with each other, a storm
    0:32:10 is brewing right outside of your door.
    0:32:14 Is it our choice to endure, or is there a power much greater than the plans of the hate-filled
    0:32:17 hearts, waging a war on the people and the power within, a power capable of manifesting
    0:32:22 a revolutionary change, a change that ripples through generations in time, seeded in this
    0:32:24 message trying to reach your mind through a rhyme, because you see the power’s in the
    0:32:28 people and the passion that’s in our hearts, but change only starts when you shine the light
    0:32:32 on the dark, beginning within ourselves and branching out to the people, educating each
    0:32:36 other to fight for our rights to be equal, no one who walks upon this earth is illegal,
    0:32:39 all this misguided hate and bigotry is spiritually lethal, empowerment for each other starts
    0:32:44 with the peaceful, not the deceitful, don’t let the con steer you wrong, the power you
    0:32:48 hold within remains strong, you just gotta believe in the power of your seed to plant
    0:32:52 amongst the weeds of the world’s evil deeds, to grow strong like a tree to feed the minds
    0:32:56 of the future, but first you must take your time to find your design that creates change
    0:33:00 in people’s lives one day at a time, then you will see it begins within thee, so may
    0:33:04 the life you lead be the life of the seed, be the change lead the way and ask yourself,
    0:33:06 what can I do today?
    0:33:17 Thank you, thank you, so this next one is called the dream, many underestimate the power
    0:33:21 of the mind, but what really lies inside the complexity of the emotional pathways that
    0:33:27 lead us to act in a certain way, what drives us to manifest positive change, is it love,
    0:33:31 is it pain, maybe it’s the dream that we all dare to scheme, this dream to be free and
    0:33:35 all live in peace, but it seems just to be that, but a dream, a dream that seems impossible
    0:33:37 to conceive or is it?
    0:33:41 We create our limitations gate, it’s the power of your mind that can grow with time or deplete
    0:33:46 with lies depending on what vibe you choose to feed inside, it begins with the light that
    0:33:49 burns deep and bright, the young activists that just wants to raise their fists to fight
    0:33:53 for the people in and out of sight because you see it’s not just about you or me, but
    0:33:57 we, together we can be this dream that we dream, but in order to achieve this dream we
    0:34:01 all need to see that I am you and you are me, that beating in your chest is your first
    0:34:04 clue, purpose, you feel that?
    0:34:07 That’s what we need to remember when faced with the choice to endeavor, it’s the power
    0:34:10 of your mind that leads you to believe that you can achieve all that you seek, it just
    0:34:14 leaves one question, what’s your seed?
    0:34:22 Thank you, and you know to like kind of answer the question, like if you don’t got no purpose
    0:34:24 when you get out, you’re going to end up right back in because you got nothing that’s going
    0:34:27 to bring you up out of that, so I just want to say it because that’s whatever, everyone
    0:34:30 has a seed, everyone has a seed to plant, you got to believe in that seed and believe in
    0:34:31 yourself.
    0:34:36 So this last one is called what’s your seed, I actually wrote this one when I started changing
    0:34:40 my life when I went to community college, I wrote it before the other two, so it means
    0:34:42 a lot to me, so.
    0:34:45 Like a scientist gone mad, creativity flows out of me, like knowledge to history, like
    0:34:49 wise words to a revolutionary, like the power in the people, but nobody is listening, while
    0:34:53 the falling clock of life just keeps on ticking, life hitting you with trials and tribulations
    0:34:57 and man you still don’t get it, you need to wake the fuck up and get on with it,
    0:35:00 if not when judgment day comes don’t look at me to save you because I wasn’t the one,
    0:35:03 while you’re gone and with your funds, steady stacking your funds, you failed the biggest
    0:35:06 test, kid, you had to prove you was worthy of the sun, instead of bringing peace to the
    0:35:10 world, they brought hate and guns, put them in the kids’ heads, said have some fun, then
    0:35:13 prove to the world cops are just out here killing the dumb, while we’re all blinded by the government’s
    0:35:17 thumb, and you all fail to see we’re all kids of the sun, I’m on this road of righteousness,
    0:35:21 steady fighting the wicked, seems like growth, love and spirituality is extremely restricted,
    0:35:25 kids caught in the cycle, like to the sheltered world it’s explicit, on your worst enemies
    0:35:29 you wouldn’t wish it, I know because I lived it, but this life is not a burden, nah, because
    0:35:33 I’m that seed planted in the garden of grief, they try to drown me statistically, put me
    0:35:37 through some shit you wouldn’t even believe, but instead of dying, I rose from the depths
    0:35:40 of despair, to breathe truths to share, soon I found myself the heir to the knowledge’s
    0:35:44 lair, now it’s up to me to train my mind, to learn how to share, many stop and stare,
    0:35:48 but not many opt to care, they’d rather shop and hate than bring these kids up and congratulate,
    0:35:52 designing our futures fate, and it doesn’t look pretty, so before the last grain of sand
    0:35:57 falls in God’s land, what will you build to grow on with sand, thank you.
    0:36:01 Thanks again for listening to this episode of the A16Z Podcast, and if you want to learn
    0:36:05 more about the Cultural Leadership Fund, please visit a16z.com.

    with Van Jones (@VanJones68), Shaka Senghor (@ShakaSenghor), and Chris Lyons (@clyons)

    True redemption can be hard to come by in our justice system today. And yet, we need it more than ever before. In this episode (based on an event hosted by Andreessen Horowitz’s Cultural Leadership Fund), CNN news commentator and author Van Jones and Shaka Senghor, author of the New York Times bestseller Writing My Wrongs and Director’s Fellow at the MIT Media Lab, discuss the U.S. prison system; the human potential for redemption; and how we begin to go about normalizing restorative justice in our society.

    The conversation, introduced by a16z partner Chris Lyons, followed a screening of an episode of Van Jones’ new series, The Redemption Project. The eight-part series looks at the families of victims of a life-altering crime as they come together to meet their offender; this episode featured the meeting between a police officer and the man who shot him, decades earlier, as a 17-year-old boy. The episode also includes two spoken word performances before and after the conversation, from two formerly incarcerated artists: first, Kevin Gentry, with “My Heart”; and second, Missy Hart, with “Bloom: A Trilogy.” Both are contributors to The Beat Within, a publication and organization that serves youth across California county juvenile halls and encourages literacy, self-expression, and community.