Summary & Insights
0:00:10 tax or investment advice or be used to evaluate any investment or security and is not directed
0:00:14 at any investors or potential investors in any A16Z fund.
0:00:18 For more details, please see a16z.com/disclosures.
0:00:21 Hi everyone, welcome to the a16z Podcast.
0:00:22 I’m Sonal.
0:00:26 I’m here today with a very special guest visiting Silicon Valley, the former prime minister
0:00:32 of the United Kingdom, Mr. Tony Blair, who now runs an institute for global change working
0:00:35 with governments, policymakers and others all around the world.
0:00:39 Also joining us, we have Andreessen Horowitz managing partner Scott Kupor, who has a new
0:00:44 book just out called Secrets of Sand Hill Road: Venture Capital and How to Get It.
0:00:49 And given that startups are drivers of economic growth and innovation, Kupor also often weighs
0:00:54 in on various policy issues, especially those that affect the flow of capital, people and
0:00:55 ideas around the world.
0:00:58 And that’s the focus and theme of this episode.
0:01:02 It’s more of a hallway-style conversation where we invite our audience to sort of eavesdrop
0:01:04 on internal meetings and convos.
0:01:09 We discuss the intersection of governments and technology and where policy comes in,
0:01:13 focusing mainly on the mindsets that are required for all of this.
0:01:17 But then we do also suggest a few specific things we can do when it comes to supporting
0:01:21 tech change for the many, not just for the few.
0:01:22 Welcome Tony.
0:01:23 Thank you.
0:01:24 Did you ask me to call you that?
0:01:25 Thank you.
0:01:26 Can everyone know?
0:01:27 He said it was okay.
0:01:28 And Kupor, welcome.
0:01:29 Thank you.
0:01:30 So let’s just get started.
0:01:34 I think the context is that there’s so much discussion right now about tech in the context
0:01:35 of inequality.
0:01:40 One of the points of view that I have, particularly coming from a background where my family came
0:01:44 from India, et cetera, is that it’s also very democratizing.
0:01:49 And a lot of people can do new things in better ways because of technology.
0:01:53 But I think the big question, the question I think we care about today is how do we bring
0:01:58 more people into the system and make sure that tech benefits everyone?
0:02:03 The first thing I would say from the political perspective is that technology is essentially
0:02:04 an empowering and enabling thing.
0:02:08 So I regard it as benign, but it’s got vast consequence.
0:02:10 So the question is how do you deal with the consequence?
0:02:14 How do you access the opportunities and mitigate its risks and disbenefits?
0:02:17 So that is, I think, the right framework to look at it.
0:02:24 But because it’s accelerating in its pace of change and because the change is so deep,
0:02:28 and I look upon this technological revolution as like the 21st century equivalent of the
0:02:32 19th century industrial revolution, it’s going to transform everything.
0:02:37 So I think the fundamental challenge is that the policy makers and the change makers are
0:02:39 not in the right dialogue with each other.
0:02:44 And this is where misfortune will lie if you end up with bad regulation or bad policy and
0:02:49 where the tech people kind of go into their little huddle, because I say this with great
0:02:55 respect, but you come to this Silicon Valley and it’s like walking into another planet,
0:02:56 frankly.
0:02:57 Yes, that’s actually really interesting.
0:03:00 I’m personally offended by that comment.
0:03:03 Now I think the difficulty is that, yes, you’re right, it’s very empowering.
0:03:07 On the other hand, it’s actually quite frightening to people because you kind of all understand
0:03:11 it and the rest of the world doesn’t quite understand it.
0:03:16 And as far as they do understand it, they find it somewhat dystopian. And look,
0:03:19 I was actually sitting with some people from my old constituency in the north of England
0:03:24 a few months back and I said to them, I wonder what’s going to happen when we have driverless
0:03:28 cars and their attitude was, it’s never going to happen.
0:03:34 And the role of a politician is to be able to, in a sense, articulate to the people those
0:03:37 changes and then fit them into a policy framework that makes sense.
0:03:41 And that’s the worry because if the politicians don’t understand it, they’ll fear it.
0:03:43 If they fear it, they’ll try and stop it.
0:03:46 You articulated the vision that we’ve always had, which is we’ve always invested around
0:03:48 this theme called software is eating the world.
0:03:53 It’s exactly what you describe, which is technology no longer kind of sits in its own box.
0:03:57 It really is the case that technology will permeate almost every industry over time.
0:04:01 I think that’s where the big change is happening now: it used to be that technology was a
0:04:02 piece of the puzzle.
0:04:04 Now, every company is a technology company.
0:04:05 Yeah, exactly.
0:04:07 So that is the kind of board on which people are playing.
0:04:11 So the issue, I think, is this, is how do you get the structured dialogue between the
0:04:13 change makers and the policy makers?
0:04:16 What would you say the number one thing if you could give advice to entrepreneurs in
0:04:21 the Valley that they should do differently to engage this kind of a framework that you’re
0:04:22 describing?
0:04:26 My advice would be stop looking at your own narrow interest in what you’re doing and
0:04:31 understand you’ve got a collective interest in making the world of policy and politics
0:04:33 understand the technology.
0:04:34 What’s going to happen?
0:03:41 How do you get, A, the right system of regulation, and, B, how do you allow government to enable
0:03:43 these transformative changes?
0:04:44 Yes.
0:04:45 Well, I actually have a question for both of you.
0:04:48 Today is the 30th anniversary of the web, the World Wide Web.
0:04:52 And I just saw the Google Doodle this morning and a note from Tim Berners-Lee, his original
0:04:55 memo, “Information Management: A Proposal.”
0:04:59 The question I have is that a lot of people would argue that the best technologies can
0:05:04 develop when you don’t try to, A, a priori predict the consequences, because you cannot,
0:05:09 they’re complex adaptive systems, and, B, there was an environment of quote permissionless
0:05:13 innovation that allowed the web to thrive, because the original makers may have foreseen
0:05:17 some apps, but the whole point is that the openness is what allowed it to thrive.
0:05:20 So I’d love to hear from both of you on how to balance that perspective.
0:05:21 So I agree with that.
0:05:25 I think though what’s different is we used to be able to compartmentalize technology.
0:05:29 It was a piece of software that you used at work to help you be more productive.
0:05:33 But if technology really is going to be part and parcel of everything, then I think it
0:05:37 changes the nature of how we think about that responsibility because it is regulated industries
0:05:42 in many cases that have been largely immune over time from technology in a way that appears
0:05:43 to be different today.
0:05:45 So I would say that then there are two questions that derive from that.
0:05:48 One is how do you make regulation intelligent?
0:05:53 How do you make it abide by the public interest or enhance the public interest, but at the
0:06:00 same time not dampen that creative and in a way entrepreneurial drive behind the development
0:06:01 of new ideas?
0:06:06 And then secondly, what are the ways that government should be working with those that
0:06:08 are going to be impacted by technology?
0:06:11 If you’re in the car industry, it’s going to be a huge change, right?
0:06:15 I mean, if you get these driverless cars, it’s going to change obviously jobs.
0:06:17 It’s going to change insurance.
0:06:22 It’s going to change the method of production, what you produce, probably change the concept
0:06:23 of car ownership in some way.
0:06:24 Absolutely.
0:06:25 It might even reshape entire cities.
0:06:26 Everything will be impacted by it.
0:06:27 Exactly.
0:06:32 I am fascinated by the potential of technology to allow African nations and governments to
0:06:36 circumvent some of the legacy problems we have within our systems.
0:06:41 And that goes for everything from basic healthcare and education through to how you help agricultural
0:06:45 small holders develop a better yield, cooperate better together, and link up better with the
0:06:46 market.
0:06:49 And in fact, one thing that’s happening in Africa today is there are applications of
0:06:52 tech that are growing up in interesting ways.
0:06:55 So my point is, you’ve got all these different facets.
0:07:00 And yet at the moment, the curious thing is, if you were to go to virtually any European
0:07:06 country or if you were to come here and say, okay, name the top four issues, where would
0:07:09 technology be in that list?
0:07:10 Would it be at the top of the list is what you’re saying?
0:07:12 No, I think it wouldn’t be in the list.
0:07:16 Kupor, you spent a lot of time actually in your role as a partner in front of Congress
0:07:20 and various entities giving testimonies about policy and curious for your take on this.
0:07:21 Yeah.
0:07:25 There is a concern I often hear from entrepreneurs, which is, how do we know if we go there?
0:07:29 How does that not just bring us into the fold of regulation and therefore have negative
0:07:32 consequences versus, you know, we talk about things out here.
0:07:35 Sometimes doing things and asking for permission later is a better strategy.
0:07:36 Right.
0:07:37 Ask for forgiveness.
0:07:38 Ask for permission.
0:07:39 Yeah.
0:07:40 I completely get that.
0:07:44 That’s why I think it’s got to be big, it’s got to be done in a big way by the collective
0:07:47 rather than by individual people going in, because of course you’re absolutely right, what will
0:07:50 happen is that the entrepreneurs think, okay, if I go and say, I’ve got the following five
0:07:53 problems that I can see in this technology I’m developing, they’re going to regulate
0:07:54 it away.
0:07:55 Yeah.
0:08:01 I think the hard question will be, you’re getting people and companies that wield enormous power.
0:08:06 I mean, not just the big tech players, but the others as well.
0:08:11 So I think one of the things that in a sense, it’s my question to you is, how do you manage
0:08:18 to get into that dialogue with policymakers where, you know, these very powerful people
0:08:23 recognize that in the end, you know, however powerful they are, they are not more important
0:08:25 than the public interest.
0:08:28 Part of what we believe our role is, is to help provide, you know, visibility.
0:08:31 I wouldn’t, I don’t want to say education because I think politicians are very well
0:08:35 educated and certainly well meaning, but to connect the divide between, in our case, DC
0:08:36 and Silicon Valley.
0:08:40 And so we will often reach out to regulators, legislators and help them understand this is
0:08:43 what’s happening from an innovation perspective and therefore these are things that you might
0:08:45 want to anticipate that you need to think about.
0:08:49 So autonomous driving is a great example, right, which you mentioned is in order to
0:08:53 make that work in the United States, we probably need forward-looking governments to say there
0:08:58 are test zones or areas where we might have almost regulatory free zones for testing purposes,
0:09:01 right, that have proper supervision, but to enable something that otherwise might not
0:09:03 exist ahead of its time.
0:09:07 Obviously, you’ve got specific micro issues, I mean, they can be big in their impact, like
0:09:11 driverless cars, but they’re specific things, with specific issues attached to them.
0:09:18 But where does the tech sector go if it wants to engage on, you know, the bigger macro question
0:09:24 of how do you redesign government, by the way, as well as individual sectors, because
0:09:27 government itself is going to have to change.
0:09:28 That organization doesn’t exist today, at least.
0:09:30 I’m not aware of where you would do that.
0:09:34 And I think the other problem with it is we have to think beyond geographic and national
0:09:39 borders on this stuff, because technology and capital are free-flowing in our society.
0:09:44 You almost need a United Nations or some kind of, you know, type of organization to convene
0:09:45 to have those discussions.
0:09:46 Yeah.
0:09:47 I would say there’s a couple of things, too.
0:09:48 There’s a couple of factors.
0:09:54 One, there are obviously lobbying entities like the NVCA, there’s the Internet Association,
0:09:56 which a lot of major companies are a part of.
0:10:02 Then there is a group of players, like there’s a group of think tanks and a middle layer,
0:10:06 and then the government agencies themselves have been soliciting testimony.
0:10:10 Kupor has actually done testimony on all kinds of things, from CFIUS to crypto to
0:10:12 various other topics.
0:10:18 But what’s really interesting to me, especially, is there’s organizations like 18F and USDS
0:10:23 in the US government, at least, where you have technologists doing literally rotating
0:10:24 apprenticeships.
0:10:28 It’s like the rotating missions, essentially, where they go for three years and they’re
0:10:30 contributing to actually reinventing government systems.
0:10:32 Now, this is a very important addition, isn’t it?
0:10:33 Yes.
0:10:34 I think it is, too.
0:10:36 And what’s really amazing is that it’s got tangible impacts.
0:10:41 So a specific example is, we have a huge veteran population that doesn’t get great healthcare.
0:10:46 So they redesigned the VA site in order to make sure that people who have accessibility
0:10:49 issues can use the site in a friendly way.
0:10:51 There’s many more applications of the types of things they’re doing.
0:10:53 We’ve actually had them on this podcast.
0:10:54 But I think those are some avenues.
0:10:56 But to Kupor’s point, there is no single entity.
0:11:01 I will say that at Wired, I edited a big set of op-eds around the ITU, which is sort of
0:11:03 like a UN for the internet.
0:11:06 And it was during the WC-12 hearings, which you might recall.
0:11:11 I think Hama Dune Touré was the head of the commission, and I edited him as well.
0:11:16 And what’s fascinating to me is that there’s a lagging versus a leading approach to it,
0:11:20 because you’re sort of taking the data that’s passed, not really looking forward.
0:11:23 And that was what I saw as a big drawback when I was working on the WCIT-12 op-eds.
0:11:28 So I’m curious for your take on how do we shift it, so you are listening to those being
0:11:34 affected by technology, but with the point of view that spins it forward for future generations.
0:11:38 Because if we had listened to all the farmers in the first wave of the Industrial Revolution,
0:11:43 we may not have many of the things today, but their grandkids are benefiting from those
0:11:44 things.
0:11:45 Yeah, no, absolutely.
0:11:48 So look, I think there are two gaps that I see, and I just look at this from the side
0:11:54 of, as it were, the ordinary politician, because I think there are initiatives that are happening
0:11:58 inside government where people or departments will get it, and therefore they’ll embrace
0:12:02 it and bring in smart people to help them and so on.
0:12:04 But I think there are two sort of lacunae.
0:12:10 One is your average politician does not understand a lot about this, and that is not sort of
0:12:12 a disrespect to your average politician.
0:12:18 It’s that it’s new, it’s complicated, it takes you time to get your head around it.
0:12:22 My eldest son is in technology, and I am always trying to get him to explain blockchain to
0:12:23 me.
0:12:24 We’re big on crypto.
0:12:25 I know.
0:12:31 I remember you sent me the other day saying, “This is the idiot’s guide to crypto currency,
0:12:33 and I still couldn’t understand it.”
0:12:35 I’m going to send you our crypto canon.
0:12:36 We took a stab at compiling all the resources.
0:12:37 Right.
0:12:38 But you probably shouldn’t test me on it.
0:12:39 But that is one lacuna.
0:12:44 Those people have to understand this is like the 19th century Industrial Revolution.
0:12:47 So you’ve got to get your ordinary politicians to understand it.
0:12:51 And then there’s another lacuna, which is, I think, in getting the dialogue at the top
0:12:56 level between particularly the Americans and the Europeans, because I also think it would
0:13:00 be immensely helpful if we had a more transatlantic approach.
0:13:01 I think there’s a third piece.
0:13:03 There’s an incentives problem.
0:13:07 I would imagine if you did a survey of most politicians, they would say, “My fundamental
0:13:11 role is how do I improve long-term economic growth and job sustainability for my constituents?”
0:13:15 I mean, if people kind of cut through a lot of the politics, that’s really why they think
0:13:16 they’re there.
0:13:19 Look, they want to make a better life and make a better opportunity for their constituents.
0:13:23 The problem we have, though, is that their short-term incentive is to get reelected,
0:13:25 which I understand is a good thing from a political perspective.
0:13:29 It’s very hard for them to take that long-term view because the shorter-term opportunity
0:13:34 is to say, “Look, I really need to do no harm to my constituents,” and allowing technology
0:13:39 might be, in the short term, displacing and unsettling to job growth and other things,
0:13:41 particularly for different segments of the population.
0:13:44 It’s very hard, I would imagine, as a politician to square those two things, which is how do
0:13:49 I help my constituents understand, you know, to Sonal’s point that, yes, over a period
0:13:54 of time, it was a good idea to have industrialized farming as opposed to pure manual agrarian
0:13:55 farming.
0:13:58 But that’s an incredibly unsettling thing, particularly in the U.S. here, if every two
0:14:01 years you have to get reelected or you go find a new job.
0:14:07 By the way, this happened in the 19th century and you had whole new politics created around
0:14:08 it.
0:14:10 And I think there are two things that are important here.
0:14:16 First of all, I think the technology will, in some way, provide solutions to what is
0:14:23 a constant dilemma for an ordinary politician, which is we need to do more for our people,
0:14:27 but we can’t just keep spending more and taxing more.
0:14:31 If the technology can help unlock part of that, that is something they’re prepared to
0:14:32 go for.
0:14:37 And secondly, with most politicians, if they’re able to see this within a longer-term perspective,
0:14:42 what you say to them is, “Look, we’ll help you and guide you through this process of change,
0:14:44 but in the end, it’s a beneficial change.”
0:14:47 And what I found when I was in government is some of the most difficult reforms we put
0:14:54 through, for example, around education reform, healthcare reform, we were able, in some ways,
0:15:01 with at least some people, to say this short-term difficulty is going to be worth it.
0:15:03 How did you pull that off, though?
0:15:05 Was it the education, the explanation?
0:15:07 Was it consensus building?
0:15:12 I mean, let me take a very specific example, which, of course, is under attack now, but
0:15:18 we introduced tuition fees in the UK, but my point was very, very simple, that universities
0:15:22 are going to be major engines of economic growth in the future, in particular because
0:15:26 of the link between university and technology and the development of technology.
0:15:31 And therefore, we cannot afford for UK universities not to be able to get the best talent, and
0:15:34 they’re going to have to therefore have an extended funding base.
0:15:35 They can’t get it all from government.
0:15:39 And my point is, if you get it all from government in the end, some governments will start to
0:15:43 slice it away, and you’re always hand-to-mouth as universities.
0:15:49 And I reckon when we did that, it was very difficult, in fact, it was extremely difficult,
0:15:55 but in the end, you were able to say to people, “Look, if we want to save our position as
0:16:00 a country that, along with the US, probably has the most high-quality universities in the
0:16:02 top 50 in the world, then we’ve got to be prepared to do that.”
0:16:07 Now, some people, by the way, rejected it, and today it’s a big political issue again,
0:16:12 but you can get to at least some form of alignment between long-term and short-term.
0:16:15 It’s a fundamental rethinking of what the role of government is, quite frankly, right?
0:16:19 Which is, again, if you take the premise that the overall objective for government is to
0:16:23 create economic conditions that hopefully generate long-term economic growth and sustainability
0:16:25 for individuals and companies, then you’re right.
0:16:29 Maybe the ancillary role of government is, how do we deal with short-term issues that
0:16:32 have market dislocations for people?
0:16:35 Maybe that’s a more proper way to describe what the role of government is, in many cases.
0:16:38 I think the other thing would be, there’s another question for politics which
0:16:43 would be very challenging, because what would be weird is if the whole of the world is undergoing
0:16:46 this revolution and politics is just kind of staying fixed.
0:16:49 The type of people who go into politics, what happens often is people leave university,
0:16:53 they’re going to become a researcher for an MP, and then they become an MP, and then they
0:16:55 become a minister, but they have no experience of the outside world, right?
0:16:59 So that’s one, and it becomes a constraint over time, and then the types of people who
0:17:00 work in government.
0:17:04 So you were saying something about the people who’ve been brought into, say, the Veterans
0:17:06 Administration here.
0:17:12 So how do you actually open up public service and then get a greater interoperability between
0:17:15 public service and the private sector?
0:17:20 Because all of the sort of pressure certainly coming from the media has been not to allow
0:17:25 that to happen, and not to allow politicians to have anything other than they’re usually
0:17:26 just focused on…
0:17:29 Okay, let’s say we all agree, which I think we do, that there needs to be a connection
0:17:32 between all the entities working together, no question.
0:17:36 More engagement, more explanation, more understanding, thinking of consequence.
0:17:37 I think those are all table stakes.
0:17:41 The question now is, how do you then think about unintended consequences?
0:17:46 Because the story, to me, is not that bad things have bad consequences.
0:17:51 It’s that often the worst consequences come from very well-intended things.
0:17:53 And quite frankly, the perfect example that comes to mind is GDPR.
0:17:57 Yeah, to make it concrete, there’s been a, over the last several months, and I’m sure
0:18:00 probably more so in Europe as well, there’s been a number of articles talking about when
0:18:05 you look at kind of the broad impact of GDPR, essentially it’s inured largely to the benefit
0:18:08 of the very large incumbents, which was probably not what it was intended to do.
0:18:09 Because they’ve got the resources to better handle it.
0:18:10 That’s exactly right.
0:18:14 And the analogy we have here in the States was the Dodd-Frank legislation that came out
0:18:18 of the global financial crisis, where financial institutions had to comply with a whole new
0:18:20 set of regulations.
0:18:24 What it really did here in the U.S. was, it really entrenched those incumbents very well,
0:18:27 and it made it very hard for startup financial institutions to grow.
0:18:31 It was very hard for a new institution to get a banking license for many years, in part
0:18:33 because of the regulatory cost of doing so.
0:18:35 And so, how do you balance that?
0:18:38 And maybe the answer is, look, it’s an education problem, but well-meaning politicians
0:18:43 certainly expect and intend that regulation is the appropriate way to deal with these things.
0:18:47 It does, in some cases, interfere with the overall goal of entrepreneurship and startup
0:18:48 formation.
0:18:53 That’s why I think that the attitude of the technology sector to engagement with government
0:18:54 is so important.
0:18:58 Because if you’re engaging with government saying, look, we understand there’s a massive
0:19:01 set of issues here, and we’re really going to sit down and work with you as to how we
0:19:07 get the right answer, then government’s in a position where they regard you as a partner.
0:19:11 But I think for this moment in time, a bit like, actually, the aftermath of the financial
0:19:17 crisis, government kind of regards it as, you’re looking after yourselves, but we’ve got to look
0:19:19 after the public.
0:19:21 And that’s where it leads to poor regulation.
0:19:28 I mean, poor regulation is nearly always the consequence of a failure on the regulating
0:19:31 side to really understand what’s going on.
0:19:35 And on the founder’s side, what I’m hearing is to really communicate the benefits of the
0:19:36 technology upfront.
0:19:40 And not to be so defensive that you’re just thinking all the time, how can we ward these people
0:19:41 off?
0:19:46 But here’s the thing, you can sometimes, if you have wealth, which a lot of these big
0:19:51 tech players do, and power, and you also have access, and you can go and see whoever you
0:19:57 want to see, it can sometimes mask your essential underlying vulnerability.
0:19:58 Interesting.
0:19:59 Right.
0:20:05 And your vulnerability is, there comes a point when suddenly the mood flips, and it doesn’t
0:20:10 matter how much wealth and power and access you have, you’re the target.
0:20:12 So that point, everything changes.
0:20:16 So if you want to avoid that, I think it’s got to be a dialogue that’s structured and
0:20:22 it requires not just things happening between the tech sector and government, but for people
0:20:28 like my own institute, to use our sort of convening power, the political side, to say
0:20:30 to the politicians, look, let’s get our heads around this.
0:20:32 Here’s my essential challenge.
0:20:37 How do you take this technological revolution and as a politician, weave it into a narrative
0:20:39 of optimism about the future?
0:20:40 Yes.
0:20:41 I want that too.
0:20:42 Right.
0:20:43 Yeah.
0:20:44 So what’s driving the populism is pessimism.
0:20:46 If people are pessimistic, then they look for someone to blame.
0:20:52 If people are optimistic, they look for something to aspire to, and that’s the essential difference.
0:20:54 It’s really interesting also, Sonal and I have been having this conversation
0:20:58 in the U.S. about ESG, right, which obviously, you know, certainly…
0:20:59 Environmental, social, and governance.
0:21:00 Right.
0:21:02 Which Europe is way ahead of the U.S. there, and Larry Fink, who’s the head of BlackRock
0:21:05 here, has written this letter, you know, appealing to CEOs, and it really goes to the same issue
0:21:09 you’re talking about, which is fundamentally, what is the role of the corporation and how
0:21:12 do corporations think about obviously enhancing value for their shareholders, but also to your
0:21:17 point recognizing that they impact constituents in many other ways, and I think that’s kind
0:21:21 of the dialogue we ought to be having with politicians, which is, look, we can create
0:21:25 a world where it’s compatible to have, you know, maximizing shareholder opportunity,
0:21:28 but also recognizing and being a part of the broader community discussion about the impact
0:21:29 on society.
0:21:35 The other thing is to recognize that when we create these things, we have some obligation
0:21:36 to share.
0:21:40 It comes out of fundamental macroeconomics, right, which is we can improve growth for
0:21:44 a country by either population growth and/or productivity growth, right?
0:21:47 Those are the two levers in theory that we can impact, and if we can frame the discussion
0:21:50 around technology, that’s a lot of where the U.S. has done well, right?
0:21:54 We’ve generally, obviously times are changing, but we’ve generally been very open to immigration
0:21:58 and thought about population growth as a way to help improve the lot for people generally,
0:22:01 and we’ve also been very open to productivity growth, right, in the form of technology and
0:22:05 automation, and if we can frame it that way, but also to your point, recognize that there
0:22:09 are going to be disintermediations along the way, and part of our responsibility is to help
0:22:14 from a training and education perspective, and even potentially the role of government
0:22:19 in subsidizing the transition from less automated to more automated society.
0:22:21 What happens to education in all of this?
0:22:23 I don’t think we have a singular point of view on it.
0:22:27 We have talked about education a lot on this podcast and shared a diversity of views, but
0:22:32 I think a couple of the high level things are that universities are huge drivers, of course,
0:22:36 as you mentioned, of innovation, and in every study of regional innovation, every innovation
0:22:43 cluster is successful because of the collaboration between universities, government, and local
0:22:44 entrepreneurial communities.
0:22:47 The other key point, however, is it’s a combination of top-down and bottom-up.
0:22:52 Attempts at top-down, industrial, planned smart cities or things like a planned Silicon
0:22:53 Valley never work.
0:22:56 But bottom-up ones alone don’t necessarily work either.
0:22:57 You need a combination of the two.
0:23:00 That’s the number one finding, but the second thing, and this is a big topic we talk about
0:23:05 on this podcast, is the importance of apprenticeship and a new type of education that really thinks
0:23:07 about skills-based education.
0:23:12 We have this elitist attitude that education has to be a certain way when, in fact, in
0:23:16 this day and age, especially with increased automation and the need for jobs, we might
0:23:19 want to be really thinking about very specific skills-based education.
0:23:24 It’s actually fascinating because, in fact, my eldest son’s got a company focused on apprenticeships
0:23:28 and technology, so that’s exactly what he does.
0:23:32 I think it’s really, really interesting because of the idea that you don’t necessarily have
0:23:33 to go to university.
0:23:38 Well, there are alternative universities coming about too, like we’re investors in
0:23:39 Udacity.
0:23:40 There’s also Lambda School.
0:23:44 There’s all these interesting types of containers where people can get what they call nano-degrees
0:23:46 or microskills or specific skills.
0:23:50 There’s so much that’s actually in play, because the point I want to raise here, this is kind
0:23:55 of an underlying theme to me, is that technology, as you pointed out, you can take an optimistic
0:23:56 view.
0:23:59 It also gives you the means to address many of the problems that we are complaining about
0:24:04 because when I think of some of the trade-offs between waiting for a government to update
0:24:12 policy, what I love is that a mass of users on a platform can essentially vote with their
0:24:18 feet by leaving that platform, and immediately that platform is going to act the next day in a
0:24:20 way that a lawmaker cannot overnight.
0:24:22 Yes, from a political perspective.
0:24:24 You want this thing at least to have some sort of rational-
0:24:25 Of course.
0:24:26 It shouldn’t be mob rule.
0:24:31 But I think the other thing is, if you take, for example, a lot of what drove Brexit
0:24:35 in the UK, apart from the immigration issue, was this idea of communities, people left
0:24:36 behind.
0:24:41 So what is it that technology would do to go into those communities and help people gain
0:24:45 better education, get connectivity to the world, because in the end, this is what it’s
0:24:46 all about.
0:24:47 If you’re not connected, you’re not really–
0:24:48 If you’re left behind.
0:24:49 Right.
0:24:55 So I think one big question is, how does the technology sector help us as policymakers
0:25:00 reach those people who may be listening to the conversation we’ve just been having, scratching
0:25:02 their heads and wondering what these guys are on about.
0:25:03 That’s a fantastic question.
0:25:05 And actually, it’s interesting because we’re investors in NationBuilder, which is one of
0:25:10 the companies that mobilize a lot of the communities that actually organize for pro and for con
0:25:11 around these things.
0:25:14 So a quick thing, I do want to make sure we actually give answers because we’re asking
0:25:15 a lot of questions.
0:25:19 So can you both give a little bit more on what concretely we need to do?
0:25:23 So from the point of view of my institute, what we’re doing is we’re creating a tech
0:25:25 and public policy center.
0:25:29 And the idea is to bring a certain number of people who really understand the tech side
0:25:32 and a certain number of people who come from the public policy side, put them together
0:25:33 in a structured way.
0:25:35 I will kind of curate that.
0:25:43 And out of it should come what I call serviceable policy and a narrative about the future,
0:25:46 which makes sense of this technological revolution.
0:25:50 And then to link up with politicians, not just in the UK and Europe, but actually over
0:25:55 here and create a sense that this technological revolution should be at the center of the
0:25:56 political debate.
0:25:57 How do we handle it?
0:26:00 How do we, as I say, mitigate its risks and access its opportunities?
0:26:03 So that’s one very specific thing.
0:26:07 And then I think the other thing, frankly, is just to be out there, myself and a number
0:26:13 of other people who at least have access to the airwaves, to say, guys, we’ve got to switch
0:26:14 the conversation.
0:26:18 You’ve got to put this technology question at the heart of the political debate.
0:26:22 Now the solution, some people may go to the left, some people may go to the right.
0:26:24 Some people may be in between.
0:26:25 But make it the conversation.
0:26:29 Put it at the top of the priorities for every country, every organization.
0:26:30 I think that’s right.
0:26:32 Fundamentally, we’re talking around the issues.
0:26:36 It’s either immigration or it’s income inequality or other things that drive the debate.
0:26:40 But the fundamental question is exactly that, which is how do we move forward with broader
0:26:41 economic growth initiatives?
0:26:46 So sitting here in Silicon Valley, any individual company is probably better off, quite frankly,
0:26:49 breaking the glass first and then asking for forgiveness later.
0:26:53 And so I think the idea of solving that collective action problem
0:26:56 through a convening organization makes a lot of sense.
0:26:59 But you come to the issues of very traditional income inequality.
0:27:03 Now there is a perfectly good question as to whether you raise the minimum wage, and
0:27:04 if so, by how much?
0:27:07 And my government introduced the minimum wage in the UK, so I’m very familiar
0:27:10 with all those arguments.
0:27:17 But in the end, there is a whole other dimension to that individual, which is about the world
0:27:20 that’s changing and their place in it, and whether they’re going to have the skills and
0:27:21 the aptitude to be able to–
0:27:25 So you’re just saying completely at every level reframe that technology is at the center
0:27:26 of that.
0:27:27 Right.
0:27:31 So it’s not that you displace traditional questions of taxation and inequality, but the
0:27:38 truth of the matter is it’s going to be probably in the long term more significant for that
0:27:43 individual and for the society if this technological revolution is handled properly.
0:27:47 So if you had a debate in the UK at the moment about our healthcare system, our national
0:27:53 health service, it would be, should we spend 10 billion pounds a year more on it or 5 billion?
0:27:59 But how you change the whole of the way we implement care for people, because of technology,
0:28:00 is going to have a much bigger impact.
0:28:01 I agree.
0:28:05 I guess the only thing I would add to this, because I think about this a lot, interestingly,
0:28:10 is that we treat technology like this word, this homogenous, nebulous entity.
0:28:16 And the reality is that every single instance so depends on the specific technology.
0:28:19 So my call to action, I guess, would be to think about it very specifically.
0:28:25 The way we think about AI, that’s such a broad phrase and it’s a very scary phrase that
0:28:30 suggests everything from generalized intelligence to very specific automation that gets your
0:28:33 bank account updated automatically.
0:28:34 So I think there are two things to this.
0:28:38 One, we need to be incredibly specific about what technology we’re talking about,
0:28:39 and in what context.
0:28:43 And two, we also need to dial in the right degree of what we’re talking about at what point,
0:28:44 because it really does make a difference.
0:28:45 I completely agree with that.
0:28:50 I mean, I think the only thing I would say is right now we’re actually far away from
0:28:53 even getting to the specifics.
0:28:54 Yeah, you’ve got a good point.
0:28:55 That’s fair.
0:28:59 You know, I was just saying this to people in politics. When we were campaigning, they’d say, you
0:29:03 know… and I’d say, right, because we campaigned on the slogan “New Labour,
0:29:04 New Britain.”
0:29:05 Right.
0:29:06 And they say, no, but it’s much more complicated now.
0:29:09 I say, okay, guys, it is more complicated, but sometimes you need to put it
0:29:10 straight.
0:29:11 I hear you.
0:29:15 You’re saying that when you go back to your old constituency
0:29:19 in Northern England, they don’t care about the specifics, they just need to have their
0:29:20 fears addressed.
0:29:22 The first thing that you need to persuade them of, you’ve got to say to them, guys, technology
0:29:24 is going to change the world and we’ve got to prepare for it.
0:29:25 They’re not there yet.
0:29:29 When you get them there, then obviously in all sorts of different ways, but this is where
0:29:35 I think that the gulf that there is between the technology sector and the politicians, and
0:29:36 therefore the people, is so big.
0:29:40 How do we deal with the fundamental challenges that we have as we talked about earlier from
0:29:44 an incentive perspective and short tenures in, you know, office and people’s ability
0:29:45 to be in office?
0:29:48 Is that a conversation you think that the country and the nations are prepared to have?
0:29:50 I think so, but it’s a very good question.
0:29:55 I would also say that in all of the change that’s going to happen, I mean, this is a
0:29:58 whole topic for another podcast, probably with the different people.
0:30:08 But how you exchange information and the validation of that information is an essential part of
0:30:12 having a democratic debate and that is a big problem in today’s world.
0:30:19 So I think it is possible to have that conversation with people, but all political conversations
0:30:25 today are extremely difficult because they happen in such a fevered environment with
0:30:31 so much polarization and the interaction between conventional and social media makes a rational
0:30:33 debate occasionally extremely difficult.
0:30:35 With that qualification, I’d answer yes.
0:30:39 I think the better way to approach that problem is to say, how do we make the U.S. and/or Europe
0:30:45 or other places attractive to entrepreneurship and encourage people to think about the regulatory
0:30:49 framework and the economic framework as, you know, wanting participants in these markets,
0:30:53 as opposed to the, you know, kind of anti policies we have, which is let’s make it harder for
0:30:56 the free flow of capital and try to stave off those opportunities.
0:31:00 It used to be that 90% of venture capital and entrepreneurship happened in the U.S., literally,
0:31:05 as recently as 20 years ago, and if you look at those numbers today, it’s about 50%.
0:31:09 And so the amount of kind of capital that’s kind of been distributed globally and therefore
0:31:12 the amount of opportunity set distributed globally is interesting.
0:31:15 We have to think about this beyond kind of regional borders.
0:31:18 We will have talent and people that are free-flowing across geographies.
0:31:21 And so we have to think about this from a broader, you know, global initiative.
0:31:25 Well, you guys, I just want to say thank you for joining the A6NZ podcast.
0:31:26 Thank you.
When journalists in 76 countries publish the same explosive story simultaneously, it creates a political shockwave capable of toppling prime ministers and forcing a global reckoning. This is the formidable collaborative model pioneered by the International Consortium of Investigative Journalists (ICIJ), as explained by its director, Gerard Ryle. The conversation digs into the mechanics of modern investigative journalism, revealing how a small nonprofit orchestrates global scoops like the Panama Papers and Paradise Papers by acting as a neutral hub for media organizations around the world. The model is a creative response to the financial decline of traditional newsrooms, leveraging partnerships in which media outlets contribute reporters in exchange for access to major leaks, ensuring both widespread impact and shared risk.
The process is meticulously designed for secrecy and scale. For the Panama Papers, more than 400 journalists worked in silence for a year inside a secure, custom-built online platform, analyzing 11.5 million documents. The strict rule was total data sharing within the consortium and a synchronized global publication date, which maximized impact and kept competitors from fragmenting the story. This approach led to immediate resignations, criminal investigations, and the recovery of hundreds of millions in back taxes. The work goes beyond any single leak; the ICIJ continually adds to a public, searchable database of offshore entities, turning a one-off investigation into an ongoing public resource for journalists, regulators, and banks.
Underpinning these investigations is a strategic adoption of technology not native to journalism. The team repurposes software from other fields: a dating-site platform for its secure virtual newsroom, government-grade data-analysis tools for scanning documents, and graph-database technology for mapping hidden connections between people and shell companies. This tech stack lets a small team manage datasets so vast they would be impenetrable to traditional reporting methods. Yet this powerful work carries significant risks, including physical threats to journalists and constant pressure from powerful subjects seeking to discredit or preempt the reporting.
The discussion frames this work within a broader crisis in journalism, where the evaporation of traditional advertising revenue has decimated investigative teams. The ICIJ’s consortium model is presented as a viable way forward, creating economies of scale and making high-risk accountability journalism feasible again. It emphasizes that the core mission of exposing systemic corruption and secrecy is more vital than ever, requiring new methods, unwavering ethical standards around the public interest, and a deliberate move away from siloed, competitive reporting toward powerful collaboration.
### Surprising Insights
- **Journalistic Barter System:** The ICIJ’s core “business” model involves trading a major story or dataset with media partners in exchange for their reporters’ time and resources, rather than paying freelancers or selling exclusives.
- **Repurposed Technology for Secrecy:** The secure virtual newsroom that allowed hundreds of journalists to collaborate globally without leaks was built on a platform originally designed for dating websites.
- **The “Neutral Referee” Role:** One key to the consortium’s success is that the ICIJ acts as the final arbiter of publication timing, preventing competitive media partners from rushing to publish and fragmenting the story’s impact.
- **Public Database as Legacy:** The leaks are not just fodder for one-off stories; the ICIJ strips out sensitive personal data and maintains a public, searchable database of offshore entities that keeps generating new stories and is used by governments and banks.
- **Attacks as Pre-Publicity:** When the Kremlin denounced the ICIJ as a CIA operation a week *before* the Panama Papers launch, Ryle saw it not as a disaster but as free, global advance publicity that built audience anticipation.
### Practical Takeaways
- **Apply the Three-Question Story Test:** To judge whether a story is worth pursuing, ask: 1) Is it of genuine public interest (affecting many people), not just a private grievance? 2) Does it reveal a systemic failure or pattern? 3) Could publishing it realistically lead to change or a concrete result?
- **Collaborate for Impact:** Especially for local journalists, consider partnering with other outlets (radio, TV, digital) on shared investigations to pool resources, share risk, and coordinate publication for far greater impact.
- **Leverage Technology to Find Patterns:** Go beyond manual document review. Use readily available data-analysis and visualization tools to process large sets of information, surface hidden connections, and identify the story within the data.
- **Build a Secure Communication Protocol:** Assume digital communications are monitored. Use encrypted messaging and email (such as Signal, Wire, or PGP) as standard practice, especially when dealing with sensitive sources or data.
- **Focus on the “Secrecy” Itself:** In investigations involving complex financial or corporate structures, follow the trail to wherever information is deliberately obscured. The act of hiding information is often where the core public-interest story lies.
A single vivid tear runs down the bronze cheek of the fallen giant Talos as he crashes to earth, defeated not by brute force but by his own longing for immortality. This heartbreaking image from an ancient Greek vase painting captures a central tension explored by historian Adrienne Mayor: for millennia, humans have been fascinated by creating artificial life, yet we cannot help feeling empathy for our creations, even when they are designed as unfeeling instruments of power or protection. Mayor’s research digs into the deep roots of our technological imagination, tracing concepts of robots, AI, and automata back to the myths and historical inventions of antiquity. These stories reveal that our current anxieties and aspirations around technology are nothing new, but rather a fundamental part of the human story.
The conversation centers on the ancient Greek concept of biotechnē, “life through craft,” which described entities that were “made, not born.” This critical distinction separated the natural from the artificial. The god Hephaestus served as the divine archetype of the inventor, fabricating everything from autonomous, intelligent golden servants (early visions of AI) to the guardian robot Talos and the dangerous Pandora. These myths were not mere fantasy but “cultural dreams” or “thought experiments” that grappled with the implications of technology long before the engineering existed to build it. They wrestled with questions of command and control, sentience, and the ethical use of automated power, often showing these inventions being deployed or misused by tyrants and gods.
Moving from myth to historical reality, Mayor uncovers a world where ancient engineers, often in the service of autocrats, built astonishingly complex automata. These ranged from a steam-powered flying dove created by a friend of Plato, to a terrifying nail-studded robotic queen used for torture in Sparta, to enormous self-moving statues that poured libations in grand parades for Ptolemy II. This historical record shows that the drive to automate and animate was actively pursued, fusing spectacle, power, and innovation. The through line from ancient myth to real-world invention demonstrates that the urge to augment human capability and play creator is a constant thread in civilization.
Ultimately, these ancient narratives provide a powerful lens for examining modern technology. They frame enduring dilemmas: will our advances be used for benevolent protection or tyrannical control? Can we maintain the foresight of Prometheus, or will we, like his short-sighted brother Epimetheus, be dazzled by short-term gains? The myths suggest that our relationship with technology is inextricably tied to human nature: our ambitions, our fears, and our persistent tendency to humanize the machines we build. They remind us that although our tools have advanced exponentially, the core questions of responsibility, ethics, and the desire to “surpass nature” are truly ancient.
Surprising Insights
- Aristotle’s Automated Utopia: In a fleeting thought experiment, the philosopher Aristotle mused that if automata such as self-weaving looms or self-playing harps existed, there would be no need for slavery, a strikingly early link between automation and social liberation.
- The Original Hacker Was a Sorceress: In the myth of Jason and the Argonauts, Medea defeats an unstoppable army of automata not with magic but by understanding and exploiting their programming, throwing a stone so that they attack one another. Mayor identifies her as a proto-hacker.
- Ancient “Uncanny Valley” Spectacles: Historical accounts describe grand parades featuring giant self-moving statues that rose up, poured milk, and gently sat back down. These public displays likely provoked the same mix of awe and unease (the “uncanny valley” effect) as today’s lifelike robots.
- A Military Legacy: The current U.S. Department of Defense project to build an automated exoskeleton for soldiers is called TALOS, linking directly back to the ancient bronze robotic guardian and illustrating how these mythic archetypes continue to inspire modern military technology.
- Pandora Was a “Fembot” on a Mission: In its original telling, the Pandora myth was about a sophisticated artificial woman, created by Hephaestus and programmed with a single devastating task: infiltrate human society and release misery from her jar.
Practical Takeaways
- Cultivate “Promethean” Foresight: When evaluating new technologies, consciously strive for the foresight of Prometheus rather than the hindsight of Epimetheus. Actively look beyond immediate benefits to potential long-term consequences and misuses.
- Examine Power Dynamics: Be skeptical of technology developed solely for, or controlled exclusively by, centralized power. The ancient myths consistently show automata amplifying the power of tyrants, a cautionary template for scrutinizing who controls advanced AI and robotics today.
- Anticipate the Human Response: Recognize that people will inevitably project emotion and sentience onto advanced robots and AI. Design with this human tendency in mind, considering the ethical implications of creating entities that stir our empathy.
- Learn from the “Biotechnē” Principle: Use the ancient “made, not born” distinction to clarify ethical debates. It helps separate discussions of biological life from those of engineered artifacts, focusing the questions on design, purpose, and accountability.
- Seek Out the “Beneficial Substance”: Inspired by a newly decoded myth in which Hephaestus imbues an automaton with “substances beneficial to humankind,” use this as a guiding principle for innovation, actively asking from the outset how a technology’s core workings can be directed toward human benefit.
with Tony Blair (@InstituteGC), Scott Kupor (@skupor), and Sonal Chokshi (@smc90)
If the current pace of tech change is the 21st-century equivalent of the 19th-century Industrial Revolution — with its tremendous economic growth and lifestyle change — it means that even though it's fundamentally empowering and enabling, there are also lots of fears and misconceptions. That's why, argues former U.K. prime minister Tony Blair (who now has an eponymous Institute for Global Change), we need to make sure that the changemakers — i.e., technologists, entrepreneurs, and quite frankly, any company that wields power — are in a structured dialogue with politicians. After all, the politician's task, observes Blair, is "to be able to articulate to the people those changes and fit them into a policy framework that makes sense".
The concern is that if politicians don't understand new technologies, then "they'll fear it; and if they fear it, they'll try and stop it" — and that's how we end up with pessimism and bad policy. Yet bad regulations often come from even the very best of intentions: Take, for example, the case of Dodd-Frank in the U.S., or more recently, GDPR in Europe — which, ironically (but not surprisingly), served to entrench incumbent and large-company interests over those of small-and-medium-sized businesses and startups. And would we have ever had the world wide web today if we hadn't had an environment of so-called "permissionless innovation", where government didn't decide up front how to regulate the internet? Could companies instead be more inclusive of stakeholders, not just shareholders, with better ESG (environment, social, governance)? Finally, how do we ensure a spirit of optimism and a focus on leading vs. lagging indicators about the future, while still being sensitive to short-term displacements, as with farmers during the Industrial Revolution?
This hallway-style episode of the a16z Podcast features Blair in conversation with Sonal Chokshi and a16z managing partner Scott Kupor — who has a new book, just out, on Secrets of Sand Hill Road: Venture Capital and How to Get It, and who also often engages with government legislators on behalf of startups. They delve into mindsets for engaging policymakers; touch briefly on topics such as autonomous cars, crypto, and education; and consider the question of how government itself, and politicians too, will need to change. One thing's for sure: The discussion today is global, beyond both sides of the Atlantic, given the flow of capital, people, and ideas across borders. So how do we make sure globalization works for the many… and not just for the few?
image credit: Benedict Macon-Cooney
The views expressed here are those of the individual personnel quoted and are not the views of a16z or its affiliates. This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors and may not under any circumstances be relied upon when making a decision to invest in any a16z funds. PLEASE SEE MORE HERE: https://a16z.com/disclosures/
