AI transcript
0:00:20 make you remarkable, and today we have a really remarkable guest, Total Badass, and I say
0:00:26 it as a compliment, Total Badass, so this is Meredith Whittaker. Meredith, welcome to
0:00:27 the show.
0:00:28 Thank you, Guy.
0:00:31 I'm delighted to be here with you.
0:00:33 Even though you’re on Paris time.
0:00:37 I am on Paris time, but that just means I’m likely to say something more interesting because
0:00:39 who knows what hour it is.
0:00:41 We offer you full editing privileges.
0:00:42 Oh, bless you.
0:00:43 Thank you.
0:00:49 I just, when I read about your background and I heard about you, like the most fascinating
0:00:54 thing, well, not the most fascinating, because there’s many fascinating things, but tell
0:00:56 me about the Google walkout.
0:00:57 I just love that story.
0:01:02 Have you ever been so indignant that you had to just pick up and do something?
0:01:06 Because that’s kind of the origin story of that.
0:01:07 Every day.
0:01:08 Yeah, right?
0:01:10 And you’re a doer, so I think we relate there.
0:01:15 That was, I was at Google since 2006, and that was a really different time in tech.
0:01:22 It was a really wild and fun place, and I learned most of what I know about tech at Google.
0:01:27 I was taking classes, I shared an office with Brian Kernighan, so I was asking him questions,
0:01:34 and the author of The C Programming Language was explaining very kindly to a novice.
0:01:39 So I really came up inside of Google in terms of my tech education, and in some sense, I
0:01:40 think I really believed it.
0:01:41 There was don’t be evil.
0:01:46 There were these kinds of principles, and there was this idea that you could be virtuous and
0:01:50 you could be successful, and I was part of that wave.
0:01:56 And as Google grew, or, to use an ungenerous term, metastasized, it became so giant,
0:02:00 with so many different divisions, that the culture didn’t always follow.
0:02:05 And I think, you know, I was very dedicated, almost naive in some sense, really trying
0:02:06 to make change.
0:02:08 I had founded an AI research group.
0:02:14 I was speaking around the world about some of these harms and earnestly trying to change
0:02:18 the conversation and the direction in ways I thought were beneficial based on the force
0:02:20 of ideas, right?
0:02:25 And there was a time around 2017 when I realized a lot of this: they seemed to clap at the end
0:02:30 of my speeches, but then the decisions didn’t change.
0:02:32 And the walkout was born out of that.
0:02:36 There was Google moving forward to be an AI military contractor.
0:02:41 A lot of the bad cultural trends, just toxic behavior being rewarded, which you would
0:02:46 see eroding your teams, eroding the work that you were doing, just really this kind
0:02:49 of unforced error, I felt.
0:02:50 And I wasn’t the only one who felt that.
0:02:55 So the walkout was a collective action where we all just said enough.
0:03:02 Was that around the time where our best friend got like $90 million after sexual harassment?
0:03:03 Yeah.
0:03:04 Yeah.
0:03:06 How do you even explain that to yourself?
0:03:09 You just fired a guy and he got $90 million.
0:03:15 That was the spark that lit the fire on the moms’ messaging group at Google.
0:03:17 And people were just like, are you kidding me?
0:03:19 I have been working nights and weekends.
0:03:22 I had not seen my baby, you know, all of this.
0:03:26 And then this guy, who everyone knew about, he drove success sometimes, but you work at a
0:03:29 company, you know the stories, right?
0:03:33 And it just became the representative for a lot of smaller harms and problems that people
0:03:34 felt.
0:03:36 But yeah, $90 million.
0:03:40 And we did the math and we were like, you know, a person, you know, a contractor working
0:03:44 minimum wage would have to work something like 3,000 years, right?
0:03:46 Every single day to make that.
0:03:51 So there was also just a feeling of inequity that really cut against the grain of the Google
0:03:54 that a lot of people felt they were promised.
0:03:55 No.
0:03:58 Wasn’t the walkout only about half an hour though?
0:04:00 Why such a short time?
0:04:04 I don’t actually remember the reasoning for some of that, but I think it was, we’re efficient
0:04:09 people and it was how do you stage an action in a bounded way, right?
0:04:10 It was around the world.
0:04:15 We started in Singapore and I remember going to bed in New York and we had this Instagram
0:04:19 account set up to share photos from the walkout because every office was going to walk out
0:04:23 at 11:11 a.m., and we had a rolling thunder.
0:04:27 So you start in Singapore and it moves through the time zones as the sun rises, we’re walking
0:04:29 out, we’re walking out, we’re walking out.
0:04:31 And I remember looking, we didn’t know if it would work, right?
0:04:35 Maybe 13 people will show up, but like we’re going to do this.
0:04:38 There’s momentum, there’s energy, like let’s go.
0:04:41 And I looked at the photo going to bed in New York and there were hundreds of people in
0:04:46 Singapore and I was like, okay, Singapore popped off.
0:04:47 This is a thing.
0:04:51 And I was trying to sleep and then got to the park in New York in the morning and it just,
0:04:55 that was a day and now it’s five years ago.
0:04:56 And did Google retaliate?
0:04:57 Yeah.
0:04:58 Yeah, they did.
0:04:59 How did they do it?
0:05:07 They basically reshuffled my job responsibilities in a way that I couldn’t accept if I wanted
0:05:11 to pursue what I was doing, and it’s classic.
0:05:15 I think “constructive termination” is the labor law term for it.
0:05:18 So constructive termination, that’s an oxymoron.
0:05:19 Yeah.
0:05:20 Yeah.
0:05:21 It’s lawyer language, right?
0:05:24 Where you’re like, the literal term doesn’t make sense, but I’m sure you know what that
0:05:25 means.
0:05:30 But it was, I was given the choice to basically become an administrator who did budgeting
0:05:33 for the open source office.
0:05:37 And I was like, I did found an AI research group, and I have all this work, and I’m recognized
0:05:38 publicly for this.
0:05:41 So I’m obviously not going to take that option.
0:05:43 And you know, at some point you get tired of fighting, right?
0:05:45 It wasn’t personal to me, right?
0:05:48 There were, like, a lot of people I really love and value at Google.
0:05:53 I was just like, if you, any human being, however much I love them, if you have that
0:05:56 type of responsibility in the world and that type of power, you need to live up to it.
0:06:00 And if you have real friends, they’re going to tell you that.
0:06:03 This do no harm thing is just total bullshit at this point.
0:06:05 I don’t know if it’s total bullshit.
0:06:10 I think there are a lot of people who feel it sincerely, but ultimately if the incentives
0:06:14 of these companies are driven by forever growth and forever profit and you have a board with
0:06:20 a fiduciary duty that is not to harmlessness, but to growth and profit, you are going to
0:06:22 prioritize those things.
0:06:29 And at some point, the far horizons of those ethical, ethically dubious choices that felt
0:06:36 so far away in 2006, like military contracting, like building a surveillance version of the
0:06:39 search engine for the Chinese market, et cetera, or maybe the American market in the future,
0:06:43 whatever it is, those horizons got closer and closer.
0:06:48 And the people filling Google, again, what is the objective function here?
0:06:49 Profit and growth.
0:06:54 And I think that’s with the type of power that the tech industry has now with centralized
0:06:58 surveillance, centralized information platforms.
0:07:02 We really need to be looking at that incentive structure because it’s not healthy.
0:07:06 I feel like this is the moment when I figured out there was no Santa Claus.
0:07:07 I know.
0:07:08 I know.
0:07:11 But you still got presents.
0:07:14 Or gift certificate at Amazon.
0:07:16 So now you’re at Signal.
0:07:18 Tell us about Signal.
0:07:20 Signal is so cool.
0:07:24 I think it is the coolest tech organization in the world.
0:07:28 It is the world’s most widely used actually private communication service.
0:07:33 And you can think about it almost as a nervous system for confidential communication.
0:07:34 Militaries use it.
0:07:35 Governments use it.
0:07:39 Any CEO who has a deal to make uses it.
0:07:41 Sports stars use it to broker their deals.
0:07:43 So it’s truly private.
0:07:51 And we stay truly private in an industry that pushes toward data collection by being a nonprofit.
0:07:57 How do you put two and two together? Like, it’s totally private, but you’re helping militaries.
0:08:01 You’re helping the same military that might be killing people.
0:08:06 Look, this is one of the facts about communications networks, right?
0:08:11 And the facts about encryption and privacy we could also talk about in this context.
0:08:18 But Signal has to be available for everyone to use for it to be useful for everyone, whether
0:08:20 you’re a priest or a pedophile.
0:08:26 Whether you’re one type of human or another type of human, right, because ultimately I can’t
0:08:29 sell a telephone network just to the good guys.
0:08:33 I don’t know who’s going to be using that, right?
0:08:37 And I think we also have to separate the infrastructure from the action.
0:08:39 There’s a human being making choices.
0:08:44 It’s not the roads that drove the car to commit the crime.
0:08:47 It’s a human being in a car who rode on those roads to go commit the crime.
0:08:50 We don’t license Signal to any one of these entities.
0:08:55 Signal is free to use for everyone, and we don’t have a military version or a CEO version,
0:08:58 and we don’t have an enterprise business model, any of that.
0:09:06 But the fact is that anywhere confidentiality is valued by anyone, Signal is valuable.
0:09:12 And if I’m listening to this and I’m just a husband or a wife and I have teenage kids,
0:09:16 is there an argument to be made that I should be using Signal?
0:09:17 Yeah, there is.
0:09:18 And what’s the argument?
0:09:21 I think the argument is, one, Signal is pleasant.
0:09:23 It is lovely.
0:09:24 We’re not selling you ads.
0:09:25 We’re not farming for engagement.
0:09:30 You’re not going to get on a feed and fall into a hole of Instagram ads.
0:09:36 So there’s something crisp and clean and elegant about Signal that I just want to put forward
0:09:41 before we talk about any of the values, because it’s actually a lovely, simple app, and
0:09:46 it takes us back to the days before that oversaturation of everything’s a bot.
0:09:47 Everything’s an ad.
0:09:48 Everything’s a feed.
0:09:54 Beyond that, I think we do need to be serious about the times we live in, right?
0:09:55 Data can be indelible.
0:10:01 My Gmail from 2005 is stored on a Google server somewhere, but our political context has moved
0:10:08 pretty dramatically since 2005, and we’re living in a time right now where there is a woman
0:10:15 living in prison because Facebook turned over messages between her and her daughter in Nebraska
0:10:20 after the Dobbs decision, and the Dobbs decision is what kicked abortion rights down to the
0:10:25 state level and allowed states to criminalize it, and they’re living in prison because they
0:10:30 discussed accessing reproductive care and dealing with that over Facebook messenger.
0:10:35 Those messages turned over to law enforcement and she’s probably cold right now in prison
0:10:36 waiting to get out.
0:10:42 There is a reason here that is really, really deep and it isn’t just a reason that exists
0:10:43 in the present moment.
0:10:49 We need to recognize that we are in very volatile times, and I think of it simply as hygiene,
0:10:50 right?
0:10:51 Why would you want that out there?
0:10:56 Why would you want every thought you thought ten years ago in a database that now may be
0:10:58 leaked or breached or turned over?
0:11:02 But I still need a network effect, right?
0:11:06 I have to convince the other five people in my family to get on too.
0:11:07 Exactly.
0:11:12 And so, like, you know, you went around the edges of this, so basically, if you are a woman in
0:11:17 Florida or Texas and you’re considering an abortion,
0:11:22 you shouldn’t use Gmail or Facebook Messenger or anything, right?
0:11:24 I wouldn’t Google it.
0:11:27 And what about the relationship with WhatsApp?
0:11:32 Isn’t your technology part of WhatsApp so is it safe to discuss abortion on WhatsApp?
0:11:36 This gets into threat modeling and all the nuance there but let’s quickly go through
0:11:40 the differences and similarities between WhatsApp and Signal.
0:11:46 So WhatsApp licenses the Signal protocol which is the core encryption that protects what
0:11:47 you say.
0:11:50 So anything you write in WhatsApp, WhatsApp can’t see it.
0:11:53 Only the people you’re talking to can see it.
0:11:57 And that’s great, but what they don’t do is encrypt metadata.
0:12:03 They aren’t protecting facts about who you talk to, who’s in your contact list,
0:12:06 when you start talking to someone, when you stop talking to them, who’s in your group
0:12:11 chats; really important, intimate information is not protected by WhatsApp.
0:12:15 And that’s a key differentiator between WhatsApp and Signal, minus the fact that WhatsApp is
0:12:17 also part of Meta, and Meta
0:12:21 could do a database join with Facebook data.
0:12:22 You get into some…
0:12:27 Here comes Cambridge, yeah.
0:12:30 And how do you pay the bills at Signal?
0:12:34 Donations and that’s going out to all the listeners who might be looking for a righteous
0:12:35 tech cause.
0:12:37 We are a non-profit.
0:12:41 We are funded by donations and this is not a nice little philanthropic outfit.
0:12:48 This is because we looked at the hard facts and we realized the core business model of
0:12:52 communications tech in this day and age is collecting data.
0:12:55 You model that data to sell ads.
0:12:57 You sell that data to data brokers.
0:13:02 You use that data to farm engagement to sell ads or train AI models.
0:13:07 And so if we were governed by that type of fiduciary duty to the shareholders and we
0:13:11 were trying to stay private, we’d basically be rowing up against this business model, and
0:13:16 it would be incumbent on our board probably at some point to be like, “Hey, can you cut
0:13:19 some of this privacy stuff out ’cause we got to make some money.”
0:13:26 So we’re very self aware in that regard and we’re looking to shape new models for tech
0:13:32 that are actually breaking that surveillance business model and creating a more independent
0:13:33 ecosystem.
0:13:37 In a sense, are you like the Wikipedia of messaging then?
0:13:41 I think the similarity is that we’re a non-profit and Wikipedia is a non-profit.
0:13:44 I think of Wikipedia as sort of an online library.
0:13:45 So it’s a different …
0:13:49 Yeah, in terms of revenue models, donations, yeah.
0:13:50 Donations.
0:13:52 And we’re exploring other models as well.
0:13:54 Are there endowments that we could set up?
0:13:59 Are there hybrid or tandem forms of profit, non-profit?
0:14:05 Yeah, the absolute primary objective here is we need to shield Signal from the pressures
0:14:08 of surveillance, from the pressures of data collection.
0:14:15 And what happens if the Koch brothers or Elon Musk or the Gates Foundation decides we want
0:14:18 to give you a $1 billion donation?
0:14:21 I sit at the table and we talk it through, right?
0:14:22 So what are the terms of that?
0:14:25 Signal must stay open source.
0:14:26 It must stay independent.
0:14:28 It must stay private.
0:14:31 It is laser focused on its mission.
0:14:34 And so, all right, let’s talk about where that billion dollars goes.
0:14:36 I think, yeah, is that an endowment?
0:14:38 How do we work with that?
0:14:43 But it wouldn’t come with control or the ability to inject code into Signal.
0:14:47 And our principles would remain really steadfast.
0:14:52 And one of the things that’s super lucky about us is we came up in a very different tech era.
0:14:56 Moxie founded Signal in the 2000s.
0:15:01 And it developed because of the virtuosic work of Moxie and Trevor on the protocol and
0:15:06 a lot of the development work, and a community of experts has formed around Signal.
0:15:12 So our protocol and the encryption implementation that we use are open and they’re scrutinized
0:15:15 by thousands and thousands of hackers and Infosec folks.
0:15:20 We have a security community that is like trainspotters for Signal’s code.
0:15:23 When we cut a new branch, it’s on GitHub.
0:15:25 We have people on the forums and Reddit looking at it.
0:15:32 So there’s an immune system that we really, really value out there that would call BS
0:15:36 on any move and we would listen to them because we really value them.
0:15:42 I was on the board of trustees of Wikipedia and what you just described is very similar.
0:15:45 The Wikipedians are like them.
0:15:52 I thought I knew evangelism because of Macintosh people, but oh my God, Wikipedia blows them
0:15:53 away.
0:15:54 Yeah.
0:15:58 And we should trade notes sometimes because there’s some, there’s some men in my mentions.
0:16:14 Very passionate.
0:16:17 If I’m a young, I don’t have to be young.
0:16:21 But if you’re an engineer and you’re thinking, wow, I’d like to go and join this Signal team,
0:16:26 I know the answer, but I’m going to ask you, should I think, oh, not for profit based on
0:16:30 donations means I’m not going to get paid a lot.
0:16:32 So how does this work?
0:16:35 Let me assure you, young engineer, we pay very well.
0:16:39 We pay as close to industry salaries as we can.
0:16:42 We have certain benefits that I think are non-monetary.
0:16:44 So it’s, we are a remote organization.
0:16:46 We work in U.S. time zones, but remotely.
0:16:48 So you have that flexibility.
0:16:50 And I do think it really matters.
0:16:52 We’re a very high caliber team.
0:16:57 So you enter in there, you’re literally shaping core infrastructure that human rights groups
0:16:59 and journalists rely on.
0:17:02 I got off the phone with a publisher this morning that I was talking to about a book I’m
0:17:04 working on.
0:17:08 And she was talking about, she published Snowden’s book and she’s like, we couldn’t have done
0:17:12 that without Signal because we had to be communicating sensitive information, right?
0:17:14 So you’re contributing to something that really matters.
0:17:18 And you personally on this small team have a real impact.
0:17:20 So I think there’s experience you gain.
0:17:24 And then there’s just, we have one magical life in this world.
0:17:25 So how do we want to spend it?
0:17:29 And I think, I think a lot of people are weighing that right now in this weird time we live
0:17:30 in.
0:17:35 And in a very sick way, I would make the case that if Donald Trump got elected, then Signal
0:17:38 will explode, because Donald Trump gets elected.
0:17:44 He makes Elon Musk Secretary of Efficiency and all of a sudden, man, you really got to
0:17:46 be careful what you say.
0:17:47 Yeah.
0:17:48 Yeah.
0:17:51 And what you said, because that data is still there on those platforms, right?
0:17:56 And this is why I say, moms, teens, it doesn’t matter if you have anything to hide.
0:17:59 You don’t want to be that weak link in your network.
0:18:05 So speaking of this kind of threat, I understand the difference between Signal and other things,
0:18:12 but a CEO was just arrested, right, in France, of all places.
0:18:17 And now, couldn’t some government make the case, well, there’s drug dealers and pedophiles
0:18:21 on Signal, so we’re going to shut that down and we’re going to arrest you?
0:18:23 They do make that case sometimes.
0:18:24 Not arrest me.
0:18:25 We haven’t gotten to that point.
0:18:32 I think, just addressing the Telegram situation quickly, there are vast differences.
0:18:37 So Telegram is basically an unmoderated social media platform that has messaging bolted
0:18:38 onto the side.
0:18:39 They’re not encrypted.
0:18:41 They do have the data.
0:18:44 And under European regulations, there’s a very particular threshold that social media
0:18:50 platforms of their size with the data they have and the public broadcast features they
0:18:52 offer have to meet.
0:18:53 Signal is only messaging.
0:18:56 We don’t have any social media broadcast functions.
0:18:57 You don’t have a directory.
0:18:58 You can’t find your friends.
0:19:01 So we actually think about how do we avoid those thresholds?
0:19:07 How do we avoid culpability here so that we are building something that’s very pure and
0:19:09 not subject to those laws?
0:19:13 So the terms of the arrests are very unlikely to hit Signal.
0:19:18 However, I think it is notable that executives of core infrastructure tech companies are
0:19:21 now on the playing board in that way.
0:19:26 And we do have to be aware of that in a geopolitically fractured world.
0:19:28 And I think it’s notable that there is a war on encryption.
0:19:33 There’s been a war on encryption since 1976 and before when Whit Diffie and Martin Hellman
0:19:37 tried to publish their paper on public key cryptography and the US government freaked
0:19:39 out and tried to stop the publication.
0:19:41 We went through the crypto wars in the 90s.
0:19:45 After Snowden, you saw full disk encryption on iOS and Android.
0:19:49 And then suddenly, in 2015, there was a kind of manufactured crisis where James Comey tried
0:19:51 to browbeat Apple.
0:19:53 And the pretext then was terrorism.
0:19:55 And now you have new pretexts around child safety.
0:20:01 But again, the target is always encryption and the ability for everyday people, dissidents,
0:20:08 organizers, anyone to communicate privately outside of government and corporate scrutiny.
0:20:11 You’re not giving me peace of mind, Meredith.
0:20:15 That’s not what I serve, Guy.
0:20:20 I want to know, going back into your checkered past, tell me.
0:20:22 What, houndstooth?
0:20:27 What is it like to testify in front of Congress?
0:20:29 It’s really stressful.
0:20:34 Like, your amygdala is firing, you’re thinking about your posture.
0:20:38 And then suddenly it’s over and you’re like, toddling out onto the street trying to find
0:20:39 an Uber.
0:20:42 But you prep, you prep, you prep, you prep, you prep.
0:20:44 You have all of your answers there.
0:20:46 You have to make sure your numbers are right.
0:20:48 So I remember having these little sheets.
0:20:52 I had my flash cards with what are the things I’m going to cite.
0:20:58 I felt when I did it, a real responsibility, right, because I am, I’m not just speaking
0:20:59 for Meredith.
0:21:03 I’m trying to get this message through and I’m trying to get it through to people who
0:21:05 are probably not thinking that hard.
0:21:07 They probably didn’t read my opening statement.
0:21:11 Their staff handed them some talking points.
0:21:14 But if I mess up, that’s the clip they’re going to pick, right?
0:21:16 And that’s the narrative they’re going to remember.
0:21:20 So it does feel to me like it’s something you really prep for.
0:21:25 And then it’s over and then you sleep on the Acela back to New York.
0:21:29 I would love to see you testify in front of Jim Jordan.
0:21:31 I think you would have him for breakfast.
0:21:32 I did.
0:21:33 I did actually.
0:21:34 What’s Jim Jordan?
0:21:40 Oh, yeah, Jim Jordan was there and he was trying to get at, he was like, is Amazon political,
0:21:43 because he was trying to get at, like, tech is anti, you know, whatever this thing.
0:21:46 And I remember hearing that question and I was like, oh, he wants me to say like tech
0:21:49 is biased against the right or what have you.
0:21:51 And I sat with it and I was like, I don’t know how to answer that.
0:21:52 That’s a trap.
0:21:56 And I was like, well, they do hire many lobbyists.
0:21:57 And he got so mad at that answer.
0:22:04 He just stopped questioning me, but yeah, I’d go for round two.
0:22:06 Let’s talk about AI now.
0:22:07 All right.
0:22:12 So first of all, what was your reaction when there was this movement, let’s pause AI for
0:22:13 six months?
0:22:16 I think unserious is the word that comes to mind.
0:22:18 I get it, right?
0:22:19 But one, what is AI?
0:22:21 We got to answer that question.
0:22:23 And, as a pedant, I often start there.
0:22:29 And two, it was a very splashy statement that certainly generated a lot of heat and a
0:22:30 lot of smoke.
0:22:36 But the reasoning for pausing AI, what a pause meant in a world where, is it developing a
0:22:39 core library that then becomes part of an AI system?
0:22:44 Is it collecting data that then is aggregated and cleaned to train an AI model?
0:22:48 Is it developing an NVIDIA chip that is going to accelerate training?
0:22:49 What is AI here?
0:22:50 And what do you mean by pausing it?
0:22:53 Again, it didn’t feel serious to me.
0:22:59 And clearly it wasn’t because here we are in the moment of increasing acceleration.
0:23:05 And the people talking about that were seemingly intelligent, experienced, and I’m like, seriously,
0:23:08 what does it mean to pause development?
0:23:10 Do you turn off all the computers?
0:23:11 What is it?
0:23:16 It’s not like, I’m going to stop harvesting redwood trees for six months.
0:23:17 That I can understand.
0:23:18 Exactly.
0:23:19 That was my question as well.
0:23:21 Are we talking about a Butlerian Jihad?
0:23:23 What are we talking about?
0:23:25 And again, this isn’t a game, right?
0:23:30 Are you sincerely worried about these threats when a five line letter that is sent to the
0:23:35 New York Times with no specificity is not actually a theory of change, right?
0:23:36 What are we doing here?
0:23:39 So I have a marketing question for you,
0:23:41 one that no one has been able to answer for me.
0:23:49 I have a folder of Claude and Perplexity and OpenAI and all that, like five of them.
0:23:52 And I have zero brand loyalty to any of them.
0:23:58 So how do these companies create brand loyalty to an LLM?
0:24:06 I think the AI market is interesting because ultimately it’s contingent on, one, compute,
0:24:09 so servers, and, two, data, right?
0:24:10 And so we know this well.
0:24:16 There are a handful of large companies that came up in the two thousands, established platforms
0:24:20 and cloud businesses, and they now dominate.
0:24:26 And ultimately the path to market, whatever the app or the LLM, is through them.
0:24:28 So a good example is Mistral in France.
0:24:34 They build open source models, large language models, open source, they’re kind of a national
0:24:35 champion in France.
0:24:36 I’m sure many of the listeners know them.
0:24:41 They do really interesting work, but they can’t just IPO, right?
0:24:42 There’s a model.
0:24:43 What are you going to do with that?
0:24:46 You can post it on Hugging Face, but that’s not a business model.
0:24:47 What do they need to do?
0:24:51 They need to find market fit, and you either do that by licensing it to one of the cloud
0:24:52 giants.
0:24:56 So Google, Amazon, Microsoft, and what they did is license it to Microsoft, and now people
0:25:01 who want to sign an Azure contract can sign up for a Mistral API.
0:25:04 Or you go through one of the platforms.
0:25:05 This is Meta, right?
0:25:10 You could be acquired by Meta, and they integrate your AI into their platforms for news feed
0:25:14 calibration, for advertiser services, for whatever it is.
0:25:19 And this also helps explain the open closed AI debate, because of course the platform
0:25:21 companies want proprietary models.
0:25:25 Because if they have them and they’re licensing them to you, that’s a market advantage.
0:25:28 Sorry, the cloud companies want proprietary models.
0:25:31 So you’re signing up for an Azure contract.
0:25:34 You can only get that through Azure or what have you.
0:25:36 But the platform companies want to integrate this.
0:25:40 Their market is integrated into the platform that you’re looking at all day.
0:25:41 So they want open models.
0:25:45 They want to be able to harvest from the work of people who are building on top of these
0:25:46 open LLMs.
0:25:51 So I think there’s interesting market dynamics that help us dig into questions like that.
0:25:55 A few minutes ago, you alluded to the fact you’re writing a book.
0:25:57 So what is this book?
0:26:03 I just sold it, and it’s kind of an alternative history of tech that starts with Charles
0:26:04 Babbage.
0:26:08 You’re going way back.
0:26:10 You did ask if I were crazy.
0:26:12 And I’ll say maybe there’s a slight symptom of that.
0:26:18 But I’ve done a lot of research on this, actually, and for me, I get a lot of joy in spending
0:26:20 time in the archives, spending time with ideas.
0:26:25 There’s an itch that gets scratched every time something comes together that I didn’t
0:26:26 understand.
0:26:31 And I’m like, wow, OK, now I get something, or whoa, I had completely misunderstood that.
0:26:36 I spent some time looking at the relationship between the industrial revolution, as it’s
0:26:43 called, computation, and the age of abolition, this period when Britain was looking to,
0:26:50 and then did, in some sense, abolish slavery, and the plantation technologies that were then
0:26:55 imported into the industrial revolution and actually informed the blueprints for computation,
0:26:57 as a side project, as a treat.
0:26:59 And you’re doing this while you’re running Signal.
0:27:03 Well, I’ve done some of this research before, so I spent about two years reading through
0:27:04 this and getting that.
0:27:08 But this is how I spend my weekends, and I love it.
0:27:12 I’m an author also, and I’m just curious about your attitudes.
0:27:18 I have written 16 books, or some people say I wrote one book 16 times, but I have made
0:27:24 a concerted effort to get them into every LLM that I can.
0:27:28 There are many authors who have the exact opposite reaction, which is I don’t want my
0:27:32 stuff in LLMs because I’m not going to get rewarded for it.
0:27:35 They’re just going to take my work and intellectual property.
0:27:42 So where are you on this? Kind of, is your book going to be out there in LLMs, so that
0:27:45 when people ask ChatGPT, what did Charles Babbage do,
0:27:47 it’s going to cite you?
0:27:51 I imagine the second someone posts it to LibGen, it will be in an LLM.
0:27:55 So it’s unclear that I would have that much control over that given the web scraping that
0:27:59 is generally creating the data sets that LLMs use and the fact that they’re too big to
0:28:01 be auditable.
0:28:05 That aside, I think the question generally to me is less like, is it in?
0:28:07 Is it not in?
0:28:08 What is the intellectual property argument?
0:28:09 Is it fair use?
0:28:11 That’s fine.
0:28:19 I’m really just interested like, are we cultivating an economic system in which creativity and
0:28:23 intellectual work and art continue to be rewarded?
0:28:26 Who is getting paid is my question.
0:28:32 And if the answer is only Microsoft, then I don’t really care what we call it.
0:28:36 That is not a system that I want to endorse because I went through art school.
0:28:43 I think art and writing and language are the way that we’re able to express our place in
0:28:44 the world to each other.
0:28:48 It’s so core to human life and human flourishing.
0:28:53 And I dread a world in which there’s no reward for that work, in which it’s just reproduced
0:29:00 by massive companies or sort of a simulacra of that work is produced by these models because
0:29:05 I don’t think we thrive as human beings without that.
0:29:11 You could not find someone who is more optimistic about the impact of AI on society.
0:29:14 I, in fact, think AI could save society.
0:29:16 So that’s where I’m coming from.
0:29:19 But I want you to explain, “Guy, you’re being naive.
0:29:23 These are the existential threats that AI provides, if you believe this.”
0:29:25 So what are the threats?
0:29:31 I want to look at AI not as a technology in a vacuum, to explain where I come from
0:29:37 here because, obviously, finding patterns in large amounts of data, super useful, assuming
0:29:42 the data is good, assuming the decision makers who are acting on those patterns are benevolent,
0:29:44 all of that.
0:29:48 But right now, when we’re talking about AI, we’re talking about these massive models.
0:29:53 They rely on huge amounts of compute, and you see these sorts of bank-busting data center
0:29:54 build-outs, all of this.
0:29:58 We haven’t even talked about the environmental impact, but that is, we’re reopening Three
0:29:59 Mile Island.
0:30:01 We’re in some weird waters.
0:30:02 What a concept.
0:30:03 Microsoft running Three Mile Island.
0:30:06 As a New Yorker, I’m like, “Excuse me.”
0:30:10 And then we’re talking about the need for huge amounts of data, the kind of data that
0:30:14 the platform companies and a handful of other companies have, and most people don’t.
0:30:23 So my concern is with centralized power and the way that AI, as a general-purpose utility
0:30:28 threaded through our lives and institutions around the world, could enable those with that
0:30:34 power to shape, reshape, and control our lives in ways that are not beneficial.
0:30:40 And so I want to look at that and think about, is this healthy and safe, given that the incentives
0:30:46 driving these companies are profit and growth, and not necessarily benefit.
0:30:52 At some level, Apple has one of the most compelling stories of AI because, literally, it can be
0:30:54 at the system software level.
0:30:57 It’s not something people go out and get.
0:30:59 It’s in every phone.
0:31:04 So is Apple the best thing for AI or the worst thing for AI?
0:31:10 Apple is doing the on-device model, which means that there’s less leakage, let’s say.
0:31:11 But I think let’s get into that.
0:31:14 What is the core of what is happening there?
0:31:15 Apple trains a model.
0:31:20 It’s small enough to run on your device, and we need to be clear these large LLMs and generative
0:31:24 models are not that small, which is part of Apple trying to figure out a private server-side
0:31:27 arrangement for the OpenAI deal.
0:31:29 But nonetheless, there’s on-device AI.
0:31:31 It’s small enough to run.
0:31:35 But it’s also oftentimes making really key decisions.
0:31:38 So do you want to read this email or not?
0:31:43 I’m not coming up with a hypothetical, but scanning your photos and saying, this is bad,
0:31:46 this is maybe you don’t want to send this, whatever.
0:31:53 And I think it’s more private, but you’re still giving Apple sort of an obscure power
0:31:57 to make decisions and determinations that I think we need to look at in the context
0:32:01 of this privacy and agency conversation.
0:32:06 If anybody believed that Google was going to do no harm, you should ask yourself if
0:32:10 you believe Apple’s going to do no harm too, because obviously things can change.
0:32:15 Well, I know WhatsApp, Signal, and others were pulled from the App Store in China, right?
0:32:21 And that’s not a, I completely understand, companies have to work within the laws of
0:32:22 certain governments.
0:32:27 But nonetheless, I think we can’t treat these companies and their incentive structures as
0:32:28 good or bad.
0:32:31 We have to recognize they’re going to be compelled to do certain things under certain
0:32:38 conditions, and we need to create structures and systems that act as prophylactics from
0:32:40 the harmful decisions, right?
0:32:42 And this is why I’m always looking at the structural level.
0:32:46 I’m always looking at the system, even, you know, I love people.
0:32:47 I’m very easy on people.
0:32:50 I’m very hard on ideas and systems, right?
0:32:54 Because I think we need to build it for robustness and we need to build it to make sure that
0:33:17 massive power and responsibility is not misused.
0:33:25 Let’s suppose that I am a parent, or I am a young girl, and I’m listening to this podcast,
0:33:31 and I’m saying to myself, or I’m saying to myself for my kids, I want my daughter, or
0:33:35 I want myself to be like Meredith.
0:33:40 So now with everything you know, how does someone,
0:33:44 how does a woman, become a leader like you today?
0:33:49 Oh, well, I would just say you all can become better leaders than me.
0:33:56 I think there’s no recipe, but I was very lucky to get a lot of good mentorship, and
0:33:59 I think find your mentors.
0:34:01 And then I entered into tech, not knowing that much about tech.
0:34:03 I have a humanities background.
0:34:05 I went to art school for most of my life.
0:34:09 I still love that world, but I like said yes to everything.
0:34:11 I signed up for everything.
0:34:16 I tried to learn everything, and I didn’t quite understand there were rooms I wasn’t
0:34:17 supposed to be in.
0:34:18 I just walk in.
0:34:23 There were tables I wasn’t supposed to be in, and I think it’s not a secret, but if
0:34:26 you can get in, figure out where you fit.
0:34:30 I would take notes at meetings I wasn’t supposed to be in, or I wasn’t invited to, or I would
0:34:33 join initiatives to try to figure out, is there a place I can help?
0:34:35 Okay, I can help by ordering the catering.
0:34:36 That’s a helpful thing.
0:34:40 And then suddenly I was in the other meeting, I’m like, hey, do these things connect?
0:34:41 Oh, that was a good idea.
0:34:42 I wouldn’t recognize I had a good idea.
0:34:44 Okay, now I’m going to be part of doing this.
0:34:50 And I think there was a stubbornness and a sort of elbowing-every-day mentality.
0:34:51 I just didn’t grow up in the elite.
0:34:54 And so I think if you don’t come from this world, don’t worry.
0:34:58 There’s a lot you can do with a little bit of street smarts and a willingness to wedge
0:34:59 your way into the room.
0:35:02 Did you have to get over the imposter syndrome?
0:35:05 In a sense, I was an imposter because there’s no reason I was supposed to be there, right?
0:35:11 Like an art school kid with a humanities degree who plopped into the middle of the Building 40s
0:35:12 at Google, right?
0:35:17 But then I was like, okay, well, I’m, I’m going to just be my own version of myself.
0:35:20 Because clearly I don’t fit in with the hoodie Stanford culture.
0:35:22 I’m going to try to figure out what I bring.
0:35:27 And so I also don’t think, don’t let the normative terms of whatever environment you’re in define
0:35:32 whether you are, whether you fit or not, make them fit to you.
0:35:37 This may seem like a dumb question, but then if you were not the Stanford hoodie culture
0:35:42 with a PhD in computer science, how the hell did you get into Google at all?
0:35:47 Well, at the time I was hired, if you had a super high GPA from a social sciences or
0:35:50 humanities background, they were hiring us to do, I was basically hired for customer
0:35:55 support, although they didn’t call it that; they called it consumer operations associate.
0:35:59 And I remember getting the call and I was like, I don’t have any idea what that is,
0:36:01 but it sounds like a business job.
0:36:02 Okay.
0:36:03 And I need to pay rent.
0:36:06 I just graduated Berkeley and I was like, all right, I’ll take this call.
0:36:13 And then I had seven interviews, two kind of personality IQ tests and a writing test.
0:36:19 And I knew I was getting close to being hired because the size of the diamonds on the engagement
0:36:22 rings of the women who were interviewing me kept getting bigger.
0:36:26 So I was like, must be going up the chain.
0:36:30 And then I got in and then I was like, oh, there’s all kinds of stories.
0:36:32 But again, I was like, I don’t know what this job is supposed to be, so I’m going to try
0:36:36 to make it, you know, they have 20% projects, they have all this stuff.
0:36:39 I’m like taking the bike around campus, meeting everyone during the office hours.
0:36:43 Did you have to answer the question, like how many manhole covers are there in the United
0:36:44 States?
0:36:48 No, it was ping pong balls in the 747.
0:36:52 And I was like, what the fuck, like is this a cult?
0:36:55 What are we doing here?
0:37:00 And now at Signal, how does Signal interview, you’re not doing stuff like that?
0:37:01 No, we don’t do that.
0:37:02 It’s a little excessive.
0:37:07 And I think it’s more of a flex than an actual methodology for rigorous talent finding and
0:37:09 a flex in a pejorative sense.
0:37:14 Well, flex in like, we are big enough and desirable enough, we can make you jump through
0:37:18 as many hoops as we want, even for an entry level job.
0:37:20 And look, like, that job was really cool.
0:37:22 They hired really rad people.
0:37:29 So it was this like bullpen of like hyper achieving humanities and social sciences kids who would
0:37:32 get all their work done in an hour and then just bounce around.
0:37:37 It was a really, it was a wild environment.
0:37:39 There is some irony.
0:37:44 You sitting in this chair at this conference about the masters of scaling, because one
0:37:50 of the concepts of Reid is that when you’re scaling, you put in these things that are
0:37:55 not exactly humanitarian and warm and fuzzy, and you look at GPAs and you look at degrees
0:38:01 and you look at that and you’re just hire, hire, hire, hire to scale.
0:38:05 And in a sense, you’re saying, look at all the problems that can create.
0:38:06 I think so.
0:38:09 I mean, small is also a scale, right?
0:38:11 So what is the problem we actually want to solve?
0:38:13 What is it we want to do in the world?
0:38:14 And how do we be discerning about that?
0:38:19 I don’t think one size fits all; it doesn’t always fit.
0:38:24 This interview has turned into a big ad for Signal, which I’m okay with.
0:38:26 I’m doing my job.
0:38:28 You’re a Signal evangelist.
0:38:30 I genuinely love it.
0:38:36 So since we’ve gone that far, I’m going to just let you for the next 30 or 60 seconds.
0:38:42 Just give this plug for Signal, why people should use Signal or join Signal if they are
0:38:43 looking for employment.
0:38:45 So this is, this is your ad.
0:38:46 Amazing.
0:38:49 Signal is a really special organization.
0:38:53 I think it takes us back to an earlier day in tech where the cool people and the weirdos
0:38:55 had their moment.
0:38:56 And I think we’re going back there.
0:39:05 By the way, I also think Signal is, it is at the cusp of a movement and a growing awareness
0:39:07 that we need to change the models for tech.
0:39:13 So if you want to be part of building a new model for healthier, cooler, more private,
0:39:20 less surveillant, less harmful tech, Signal is not only doing that, but it’s shaping
0:39:23 a model for how we can do that across the ecosystem.
0:39:30 To quote my colleagues, Maria Farrell and Robin Berjon, to rewild the tech ecosystem.
0:39:36 So we don’t just have a handful of giants, you know, consuming all of our data and producing
0:39:40 products we may or may not like because they have OKRs they have to meet.
0:39:46 We actually have a teeming ecosystem of really smart solutions built on open source and open
0:39:52 protocols that are actually private, that are swimming upstream and doing it successfully.
0:39:57 If you’re into badass shit, I think you’re into Signal.
0:40:02 I must say this is about the 260th episode of Remarkable People.
0:40:08 And this is the only one that basically turned into an ad for the guest’s company.
0:40:10 I should be proud of that.
0:40:11 Thank you.
0:40:15 I’m glad it worked out that way because again, the ad is coming from the most sincere place
0:40:16 of my being.
0:40:21 Your PR person over there is going to say, oh my God, Meredith, you just hit it out of
0:40:22 the park.
0:40:23 Thank you.
0:40:24 That’s for you.
0:40:25 You know who you are.
0:40:26 All right.
0:40:34 Listen, this has been the Remarkable People podcast and I hope you appreciated this ad
0:40:41 for Signal and Meredith Whittaker, and particularly if you’re a woman and want to emulate her.
0:40:47 I think you learn a lot about climbing the ladder and kicking ass.
0:40:48 Thank you, Guy.
0:40:49 It’s been just a delight.
0:41:01 And thank you everyone who listened to this non-targeted ad.
0:41:03 This is Remarkable People.
In this episode of Remarkable People, join Guy Kawasaki for an illuminating conversation with Meredith Whittaker, President of Signal and former Google AI ethics researcher. Discover how she led the historic Google walkout, her vision for private communication technology, and her critical perspective on AI’s impact on society. Whittaker shares insights on leadership, challenging tech industry norms, and building ethical alternatives to surveillance-based business models.
—
Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.
With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.
Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.
Episodes of Remarkable People organized by topic: https://bit.ly/rptopology
Listen to Remarkable People here: https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827
Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Thank you for your support; it helps the show!
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.