a16z Podcast: Fintech for Startups and Incumbents

AI transcript
0:00:02 – Hi, this is Frank Chen.
0:00:04 Welcome to the A16Z podcast.
0:00:07 Today’s episode is titled Three Ways Startups Are Coming
0:00:11 for Established Fintech Companies and What to Do About It.
0:00:13 It originated as a YouTube video.
0:00:18 You can watch all of our videos at youtube.com/a16zvideos.
0:00:19 Hope you enjoy.
0:00:22 – Well, hi, welcome to the A16Z YouTube channel.
0:00:24 I’m Frank Chen, and today I am here
0:00:27 with one of our general partners, Alex Rampell.
0:00:29 I’m super excited that Alex is here.
0:00:32 So first fact, we both have sons named Cameron.
0:00:33 – We do.
0:00:34 – So affinity there.
0:00:36 And then two, one of the things
0:00:38 that I really appreciate about Alex,
0:00:39 and you can sort of see this
0:00:41 from his young chess playing days,
0:00:46 is he understands fintech and incentives
0:00:48 and pricing backwards and forwards.
0:00:52 And so fintech has this hidden infrastructure
0:00:54 on how do credit card transactions work?
0:00:56 How do bonds get sold?
0:00:58 How are insurance policies priced?
0:01:02 And there’s deep economic theory behind all of these
0:01:04 and Alex understands them all.
0:01:05 So you’re gonna have a fun time
0:01:08 as Alex takes you through his encyclopedic knowledge
0:01:10 of how these things are put together.
0:01:12 And so excited to have you.
0:01:13 – Yeah, it’s great to be here.
0:01:15 – So what I wanted to talk to you about is
0:01:19 I’m gonna pretend to be in the seat of a,
0:01:21 let’s call it an incumbent fintech company, right?
0:01:26 So I'm a product manager at Visa or at Geico.
0:01:30 And I am looking in my rear view mirror
0:01:33 and there are startups in the rear view mirror.
0:01:35 And I’m very nervous that the startup
0:01:39 in the rear view mirror, exactly as the mirror says,
0:01:41 objects in mirror may be closer than they appear,
0:01:44 is like, wow, they are catching up to me faster
0:01:45 than I really want.
0:01:49 And so I wanna understand like, what are startups doing?
0:01:53 Like how would they mount an attack on me, the incumbent?
0:01:56 And we’re gonna talk about sort of wedges they can use.
0:01:58 And then that’s sort of the first half,
0:02:00 like how are they coming after me?
0:02:02 And then the second half, let’s talk about like,
0:02:03 and what should I do about it?
0:02:05 So that's sort of the premise for our conversation.
0:02:09 so why don’t we start with the attacks?
0:02:11 Like how would a startup come for me?
0:02:14 And one way they come for me is they come
0:02:16 after my best customers.
0:02:19 – Well, so this is the interesting thing
0:02:21 about financial services in general,
0:02:25 because there's a Sharp television hanging on the wall.
0:02:27 And Sharp knows that they make more money
0:02:30 every time they sell an incremental television.
0:02:33 So more customers equals more money, cause, effect.
0:02:35 And the interesting thing is that for many kinds
0:02:37 of financial services, that is not true.
0:02:39 Because what you’re really trying to do
0:02:41 is assemble a risk pool.
0:02:43 And the best example of this is insurance.
0:02:45 So what is car insurance?
0:02:48 Car insurance has good drivers, okay drivers,
0:02:50 and bad drivers.
0:02:53 And effectively, your good drivers and your okay drivers
0:02:56 are paying you every month to subsidize the bad drivers.
0:02:57 So the same thing goes for health insurance.
0:02:58 You have people that are always sick,
0:03:00 you have people that are always healthy.
0:03:02 And if you are an insurance company
0:03:05 that only provided insurance for very, very sick people,
0:03:07 or if you’re a car insurance company
0:03:10 that only insures people that get into accidents every day,
0:03:13 there’s no economic model to sustain that.
0:03:16 You actually have to accumulate the good customers
0:03:18 and use them to pay for the bad customers.
0:03:20 And the interesting thing about this is that
0:03:22 from the perspective of the good customer,
0:03:24 it’s not fair.
0:03:26 And I’m not talking morally or philosophically,
0:03:30 but just from a capitalist or economic viewpoint,
0:03:32 it’s like, okay, I want life insurance
0:03:35 and I eat five donuts a day.
0:03:35 I just had a donut today.
0:03:36 I don’t eat five a day.
0:03:40 But I have one donut every Friday as you can testify.
0:03:43 And then I have a friend who goes to the gym five times a day,
0:03:45 never eats a donut.
0:03:47 That guy’s probably gonna live longer than me.
0:03:50 Hopefully not, but probabilistically,
0:03:53 he’s probably going to have a better time than I am
0:03:54 in terms of life expectancy.
0:03:57 So why is it that we both pay the same rate?
0:04:00 And that just seems unfair to him.
0:04:03 It seems great to me because he’s subsidizing me.
0:04:05 – Yep, gym guy subsidizing donut guy.
0:04:06 – Exactly, exactly.
0:04:08 And that seems unfair.
0:04:11 And then the startups can sometimes exploit
0:04:13 that psychological unfairness,
0:04:15 like that feeling of unfairness.
0:04:16 So, and it kind of does two things
0:04:19 because from the big company perspective,
0:04:20 if you were to take away,
0:04:22 think of it as a normal distribution.
0:04:24 So most people are in the middle
0:04:26 and they’re just gonna live whatever
0:04:30 to the average of 79.6 years or whatever it is right now.
0:04:31 Some people are gonna live forever.
0:04:34 They’re the ones that have the olive oil go to the gym
0:04:36 and do whatever it is that they do
0:04:38 that makes them live a long time, great genes.
0:04:40 And then some people are gonna die early.
0:04:43 And from the perspective of the startup,
0:04:44 if you can get all of the people
0:04:47 that are going to live much, much longer,
0:04:49 you’re going to be more profitable.
0:04:50 The same thing for car insurance.
0:04:52 If you can get all the people on the good end
0:04:55 of that distribution curve, you’re going to make money.
0:04:57 And then the nice thing is that
0:04:59 if you’re starting a brand new company and saying,
0:05:01 "Hey, I'll give you a loan if you can't get a loan.
0:05:02 “Who’s gonna sign up for that?
0:05:04 “People who might be bad.”
0:05:05 If I say, “I’m gonna give you insurance
0:05:07 “if you can’t get insurance.
0:05:08 “Who’s gonna sign up for that?
0:05:10 “The people that are eating all the donuts.”
0:05:11 And that might not be very good.
0:05:15 So it actually has this nice kind of symbiosis
0:05:17 between if you do it correctly,
0:05:19 you get positive selection bias
0:05:22 in that you establish new criteria.
0:05:24 Part of that new criteria is based on data,
0:05:26 but part of it is based on psychology.
0:05:28 The psychology is I’m treated unfairly.
0:05:30 I want to be treated more fairly.
0:05:32 That yields a lower price for people
0:05:35 for a pretty demand elastic product.
0:05:37 So I say, “I can get life insurance at half the rate
0:05:38 “because I’m going to the gym.
0:05:39 “That sounds great.
0:05:40 “That sounds fair.”
0:05:41 But to answer your question,
0:05:44 what the incumbent might be left with
0:05:47 is not half of the number of customers.
0:05:48 Like that could be the case.
0:05:50 It could be half the number of customers,
0:05:51 but it could be half the customers
0:05:54 and all of them are entirely unprofitable.
0:05:55 – Right, they took all the profits.
0:05:57 They didn’t have to take all your customers.
0:05:58 They just had to take the good ones.
0:06:00 – Right, so actually, if you just take,
0:06:03 and the funny thing is that because it’s not like
0:06:05 I want to get, “Oh, Geico has X million customers.
0:06:07 “I want X plus one million customers.”
0:06:08 You actually might want one tenth
0:06:10 as many customers as Geico.
0:06:12 Because if you can just get the good ones,
0:06:15 I mean, what if you give people a 50% discount,
0:06:17 not a 15% discount, like Geico always advertises about,
0:06:20 but a 50% discount on their car insurance,
0:06:23 and these are the absolute best drivers in the country,
0:06:26 how many claims do you have to pay out on the best drivers?
0:06:30 You might have to pay out nothing, literally nothing.
0:06:31 And if you have to pay out nothing,
0:06:33 and there are all these mandatory loss ratios
0:06:34 for different insurance industries,
0:06:35 so I don’t want to get into that.
0:06:39 But imagine that unregulated, you can pay out nothing.
0:06:42 Consumers feel like they’re treated very fairly.
0:06:44 They’re rewarded for better behavior.
0:06:48 This begets positive selection and not adverse selection,
0:06:51 then you’re going to have the most profitable lending company
0:06:53 or insurance company in the world,
0:06:55 because it really is a unique industry
0:06:57 where more customers is actually worse
0:07:00 than fewer but more profitable customers
0:07:03 because each incremental customer is like a coin flip
0:07:04 of profit or loss.
0:07:07 Might generate profit, might generate loss.
0:07:09 And that’s not true for the vast majority of industries.
0:07:11 Like Ford never sells a car saying,
0:07:13 “Maybe we’ll lose money on this customer.”
0:07:14 – Right, right.
0:07:17 They just like, “I need everybody to buy a Ford F-150.”
0:07:20 And if you don't buy an F-150, I need you to buy
0:07:22 their other thing, the Expedition or whatever.
0:07:24 – They might lose money on the marginal customer
0:07:26 until they hit their fixed costs.
0:07:28 But they’re never going to have a coin flip
0:07:29 of when they sell the car.
0:07:31 Hmm, maybe we shouldn’t have sold that car,
0:07:33 but that’s what every insurance company has
0:07:34 when they underwrite a policy.
0:07:37 And that’s what every bank has when they underwrite a loan.
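As a rough sketch of the coin-flip economics described above, here is the pooled-premium arithmetic with entirely made-up numbers (the premium, claim probabilities, and the 50% discount are illustrative, not from Geico or any real insurer):

```python
# Illustrative only: why skimming the low-risk tail can beat having more customers.
def expected_profit(premium, claim_prob, avg_claim):
    """Expected annual profit per policy: premium minus expected payouts."""
    return premium - claim_prob * avg_claim

# Incumbent pools everyone at one price.
pooled_premium = 1200          # dollars per year, made up
segments = {                   # (share of book, claim probability, average claim)
    "good":  (0.50, 0.02, 5000),
    "okay":  (0.35, 0.10, 5000),
    "bad":   (0.15, 0.40, 5000),
}

pooled = sum(share * expected_profit(pooled_premium, p, c)
             for share, p, c in segments.values())
print(f"incumbent, per pooled customer: ${pooled:,.0f}")

# Startup takes only the good tail and gives them a 50% discount.
share, p, c = segments["good"]
skimmed = expected_profit(pooled_premium * 0.5, p, c)
print(f"startup, per cream-skimmed customer: ${skimmed:,.0f}")

# Incumbent's remaining book after the good customers leave.
rest = sum(s * expected_profit(pooled_premium, p, c)
           for name, (s, p, c) in segments.items() if name != "good")
rest /= (1 - segments["good"][0])
print(f"incumbent after losing the good tail: ${rest:,.0f}")
```

With these made-up numbers the startup still clears money at half the price, while the incumbent's per-customer profit on the remaining book collapses.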
0:07:39 – Yeah, so auto insurance companies need to find people
0:07:41 like me, I have this old Prius, right?
0:07:45 First, it’s hugely reliable car.
0:07:47 And then I drive like a grandma
0:07:49 because I’m optimizing for fuel efficiency.
0:07:52 So I rarely go above 65.
0:07:54 And so really safe, I’ve never filed a claim.
0:07:57 They need more customers like me.
0:07:58 And that’s what drives the profits.
0:07:59 – Yes.
0:08:00 – ‘Cause there’s no payouts.
0:08:01 – Well, not only does it drive the profits,
0:08:05 it actually subsidizes the losses.
0:08:08 Because there are a lot of people who are the inverse of you
0:08:09 and you’re paying for those people
0:08:11 and the transfer mechanism is through GEICO.
0:08:12 – Yeah.
0:08:16 I saw an ad in my Facebook feed recently
0:08:17 for Health IQ and I think they're doing something
0:08:18 like this too, right?
0:08:20 So the, I think the proposition was,
0:08:22 hey, can you run a mile in less than nine minutes?
0:08:24 Can you bench press your own weight or something like that?
0:08:27 There’s all these like, oh, healthy people.
0:08:29 And is that the mechanism they’re exploiting?
0:08:30 – It’s exactly that.
0:08:32 I would say the first company to probably do this
0:08:36 on a widespread basis in FinTech land was SoFi.
0:08:38 And SoFi said, hey, you’re really smart.
0:08:39 They actually coined this term.
0:08:40 They called it the Henry.
0:08:42 High earning, not rich yet.
0:08:44 Because if you look at how student loans work,
0:08:46 it’s like everybody gets the same price
0:08:48 on their student loan, right?
0:08:49 It doesn’t matter what your major is.
0:08:54 It doesn't matter what your employment prospects are,
0:08:57 everybody gets the same rate.
0:09:00 You get this rate, you get this rate, you get this rate.
0:09:01 Because a lot of it is effectively underwritten
0:09:02 by the US government.
0:09:04 And that’s not, so think about it again
0:09:07 from the twin pillars of psychology.
0:09:09 I mean, the psychology of the borrower.
0:09:12 Like how come I’m paying the same rate
0:09:14 as that person who’s going to default?
0:09:15 That’s just not fair.
0:09:17 I’m never going to default.
0:09:20 In fact, I’m gonna pay back my student loans early.
0:09:22 So that helped.
0:09:24 And then again, positive selection
0:09:26 versus adverse selection because,
0:09:29 and actually refinance has this concept in general.
0:09:30 Because I would say, if you’re planning
0:09:33 on declaring bankruptcy, or if you’re saying,
0:09:36 I’m going to, I’m gonna join Occupy Wall Street
0:09:38 and never pay back my loans and I hate capitalism,
0:09:40 why would you go refinance?
0:09:41 It just doesn’t make sense.
0:09:42 – Right.
0:09:44 – Because you’re just gonna default.
0:09:44 – Right.
0:09:46 – So if you raise your hand, and actually it’s interesting,
0:09:48 even on the other side, there are a lot of companies
0:09:50 in what I would call the debt settlement space.
0:09:52 And this is something that most people don’t know about.
0:09:55 But if you listen to like some interesting talk radio,
0:09:58 you’ll hear all these ads for debt settlement.
0:09:59 And what is debt settlement?
0:10:01 It’s saying, hey, do you have too much debt?
0:10:05 If you call us, we will negotiate on your behalf
0:10:08 and pay off your debts, and then you just owe us.
0:10:10 And you kind of need this intermediary layer
0:10:13 because imagine that you owe $10,000 to Capital One
0:10:15 and you can’t pay it back.
0:10:16 You call it Capital One.
0:10:18 It says, press one for your balance.
0:10:20 Press two to get a new card mail to you.
0:10:22 Press three if you don’t want to pay us the full amount
0:10:23 and want to pay us less.
0:10:26 Everybody's gonna push three, right?
0:10:27 – This is why they don’t offer that option.
0:10:29 – They don’t offer that option, nor will they ever.
0:10:34 However, on talk radio, and this is very big in the Midwest,
0:10:37 like you’ll hear, you know, Freedom Financial.
0:10:39 Go call Freedom Financial and we will settle
0:10:40 your debts for you.
0:10:42 So they call Capital One and say, look,
0:10:43 Alex can’t pay you back.
0:10:45 We’ll pay you $2,000 right now,
0:10:47 and then you’re gonna get rid of the loan.
0:10:48 And you’re like, well, we’re not happy
0:10:50 taking 20 cents on the dollar,
0:10:52 but it’s better than zero cents on the dollar, fine.
0:10:53 We’ll take it.
0:10:57 And then you owe Freedom Financial the 20 cents.
0:11:00 But why do they feel comfortable underwriting that?
0:11:03 Because you rose your hand, you said,
0:11:05 I want to get out of debt.
0:11:08 And that’s positive selection bias right there.
0:11:10 Because people who are just deadbeats,
0:11:12 because behind every credit score,
0:11:13 if you think about how that works,
0:11:15 it’s willingness and ability to repay.
0:11:18 And the psychological trait of the willingness
0:11:20 is in many cases as important
0:11:22 as the financial constraint of the ability.
0:11:25 Because if I owe a million dollars to somebody,
0:11:28 and I only make $100 a year,
0:11:29 it doesn’t matter how honest I am,
0:11:31 I can never pay that back.
0:11:33 Unless I'm gonna live 10,000 years,
0:11:34 and then I guess I could pay it back.
0:11:36 But otherwise I can’t pay that back.
0:11:38 But the willingness to repay is interesting.
0:11:40 And that’s very important.
0:11:42 And that’s again, this kind of psychological trait
0:11:45 that’s captured in this idea of positive selection.
0:11:46 So what does SoFi do?
0:11:48 They kind of again hit this twin pillar,
0:11:52 which is I want to only get the good customers,
0:11:55 I’m going to reprice them and steal them
0:11:58 from the giant pool that again, normal distribution,
0:12:00 these are the losers, these are the whatever’s,
0:12:03 and these are the people that you have no risk on whatsoever.
0:12:06 Let’s steal all of these people over here.
0:12:07 And it makes them feel good.
0:12:09 It’s a better marketing message.
0:12:11 At least it’s differentiated.
0:12:13 How do you compete with everybody?
0:12:14 It’s like, hey, we’re just like Chase,
0:12:17 but smaller and a startup and not profitable
0:12:20 and you probably shouldn’t trust us, bad marketing message.
0:12:22 Good marketing message is you’re getting ripped off.
0:12:26 We’re going to price you fairly, come to us.
0:12:29 So SoFi did this for lending.
0:12:30 – And what did Health IQ do?
0:12:32 – So Health IQ did this for health,
0:12:33 really for life insurance.
0:12:35 So they started off with a health quiz
0:12:38 because I mean, it seems almost self-evident
0:12:40 that healthy people are healthy,
0:12:42 I mean, it’s a tautology,
0:12:44 like healthy people are healthier than not healthy people,
0:12:47 but can you actually prove this
0:12:48 from a life expectancy perspective?
0:12:50 So they started off with just recording data
0:12:53 and then building a mortality table.
0:12:56 And it turned out that what I would assume
0:12:59 is a prima facie case turned out to actually be correct,
0:13:01 which is these healthier people do live longer
0:13:03 than not healthy people.
0:13:06 And then they turn that into both a positive selection
0:13:07 advertising campaign,
0:13:09 which differentiated them from a brand perspective,
0:13:12 but also left them more profitable.
0:13:13 So what they do is they say, yeah,
0:13:16 can you run a nine or an eight minute mile?
0:13:19 Can you do these things to prove
0:13:21 that you’re better than everybody else?
0:13:22 And why is that important?
0:13:24 Well, from their own balance sheet
0:13:25 or profitability perspective,
0:13:27 they want to get these good customers
0:13:30 versus a brand new life insurance company
0:13:32 that said, hey, life insurance takes too long to get,
0:13:34 it’s a big pain and it’s expensive,
0:13:36 we’ll underwrite you on the spot in one minute,
0:13:39 no blood test, that’s gonna be adverse selection.
0:13:42 That’s like, ooh, I think I’m gonna die soon.
0:13:44 I want to get, everybody rejected me for life insurance,
0:13:45 I’m going to that company.
0:13:48 As opposed to here, they’re only getting the customers
0:13:50 that kind of hit,
0:13:52 that think they’re gonna hit the underwriting standard,
0:13:53 which is great.
0:13:55 They think it’s fair.
0:13:57 So it’s a differentiator from a brand perspective.
0:14:00 And then it turns out that, again,
0:14:02 each marginal customer in insurance
0:14:03 is kind of a coin flip.
0:14:05 They’re getting a weighted coin
0:14:06 because they’re only getting people
0:14:09 on the far right side of this normal distribution.
0:14:14 – So wedge number one is exploit psychology, right?
0:14:16 Positive selection rather than adverse selection
0:14:18 and what you’ll end up with
0:14:20 because of this sort of unique dynamic
0:14:21 of the FinTech industry
0:14:24 is you’ll end up with the most profitable customers.
0:14:25 What’s wedge number two?
0:14:27 We’re gonna talk about sort of new data sources
0:14:29 and what startups can do
0:14:31 to sort of price their products smarter than incumbents.
0:14:36 – Right, so imagine that you have a group of 100 people
0:14:38 and of the 100 people,
0:14:40 half of them are not going to pay you back.
0:14:43 So think of this as the old combinatorics problem
0:14:44 of bins and balls.
0:14:46 So you’ve got this giant ball pit,
0:14:48 you scoop up 100 balls in your bin
0:14:50 and half of them are going to be bad,
0:14:53 half of them are going to be good.
0:14:56 So what’s a fair rate of interest if you’re a lender,
0:14:59 that you have to charge this whole bin
0:15:01 if half of them are going to default
0:15:03 and you assume that you can’t lose money?
0:15:05 The answer is going to be 100%.
0:15:08 – Oh right, because half of them you have to make up
0:15:09 for all the deadbeats.
0:15:10 – So half of them, you lose all of your money,
0:15:12 half of them you double your monies,
0:15:13 you’re back to square one.
0:15:14 – Now you’re even.
0:15:15 – Now you’re even.
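As a rough sketch of the break-even arithmetic in that bin (the 50% default rate is just the number from the example, and the sketch assumes defaulters repay nothing):

```python
# Break-even rate when a fraction d of borrowers default and repay nothing:
# the survivors must cover the whole bin, so (1 - d) * (1 + r) = 1  =>  r = d / (1 - d)
def break_even_rate(default_rate):
    return default_rate / (1 - default_rate)

print(f"{break_even_rate(0.50):.0%}")  # 100% -- half the bin defaults
print(f"{break_even_rate(0.10):.0%}")  # ~11% -- better data, fewer deadbeats, lower rate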
0:15:18 So the problem is that that’s not good
0:15:19 because well in the United States
0:15:21 you can’t charge 100% interest.
0:15:24 It's called usury, and there are other parts of the world,
0:15:28 again, where it's illegal, parts of Europe, so that's a problem.
0:15:32 But what if you can use different data sources to,
0:15:36 again, it’s not positive versus adverse selection
0:15:38 as in some of the insurance companies,
0:15:41 but it’s saying can I collect more forms of data
0:15:43 so that instead of saying the only way
0:15:45 that I can make my operation work
0:15:46 is to charge an interest rate
0:15:49 which actually turns out to be illegal,
0:15:51 can I come up with more data sources
0:15:53 that effectively, even though discrimination
0:15:55 sounds like a terrible word,
0:15:57 and it’s normally used in that construct,
0:15:59 if you discriminate against criminals that’s fine.
0:16:01 I mean some of the people that try to take advantage
0:16:04 of lenders are actual organized crime.
0:16:05 You don’t want them in your bin,
0:16:06 you want to throw them out.
0:16:09 How do you take more data sources
0:16:11 and actually start measuring this?
0:16:12 And the interesting thing here,
0:16:13 and it’s somewhat unfortunate,
0:16:17 but you have a giant market failure happening
0:16:18 in many different regions of the world
0:16:19 because in the United States,
0:16:21 like the top interest rate that you can charge,
0:16:22 it’s regulated on a state by state basis,
0:16:25 but Utah has a 36% usury cap,
0:16:28 so a lot of people export that cap.
0:16:30 That’s a lot less than 100% that I was mentioning.
0:16:32 And there are lots of ways of kind of gaming that system
0:16:34 and you charge late fees and you charge this fee,
0:16:36 so it actually might end up looking more like 100
0:16:39 or 200%, so effectively you can charge more than 36%.
0:16:42 And then you actually can’t use certain types of data
0:16:44 if they are prone to having an adverse impact.
0:16:46 So if you think about how machine learning works,
0:16:47 I always kind of describe it
0:16:50 somewhat oversimplistically as linear algebra,
0:16:53 where I have, here’s every user that I’ve ever seen,
0:16:55 here’s every attribute that I’ve ever measured,
0:16:57 and what I’m looking for is like strange correlations
0:16:59 that I can’t even explain.
0:17:01 So I’m gonna ask you,
0:17:02 I’m not even gonna ask you a lot of these things,
0:17:04 it’s like, how long did you fill out this field for
0:17:06 on my loan application?
0:17:08 Did you enter all caps or not all caps?
0:17:09 Just all of these different things.
0:17:11 – Did you take the slider on how much do you want
0:17:12 and jam it all the way to the right?
0:17:13 – Right, all of these things.
0:17:16 – I can ask you, do you have a pet or not?
0:17:16 That might be interesting.
0:17:18 I don’t know if that’s a leading indicator
0:17:20 of default or not, but I wanna collect
0:17:21 all these different variables,
0:17:22 and then at the end of the day,
0:17:24 I’m going to see default or not default.
0:17:26 That’s the output, and then I’m going to see
0:17:27 what’s correlated with that.
0:17:29 And it’s a little bit of this, it’s a little bit of that,
0:17:31 I can't explain it, but the computer can.
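A minimal sketch of that "linear algebra" framing: every applicant is a row, every measured attribute is a column, and the model finds weights that correlate with default. The feature names echo the hypothetical signals mentioned in the conversation, and the data is random noise just so the sketch runs:

```python
# Sketch only: fit default/no-default against a grab-bag of behavioral features.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
features = {                                   # hypothetical signals from the conversation
    "seconds_on_income_field": rng.exponential(20, n),
    "typed_in_all_caps":       rng.integers(0, 2, n),
    "slider_jammed_right":     rng.integers(0, 2, n),
    "has_a_pet":               rng.integers(0, 2, n),
}
X = np.column_stack(list(features.values()))
X = (X - X.mean(axis=0)) / X.std(axis=0)       # put every signal on the same scale
y = rng.integers(0, 2, n)                      # 1 = defaulted (synthetic noise here)

# Simple logistic regression by gradient descent -- "a little of this, a little of that".
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))              # predicted probability of default
    w -= 0.1 * Xb.T @ (p - y) / n              # gradient step on the log-loss

for name, weight in zip(["intercept", *features], w):
    print(f"{name:>28}: {weight:+.3f}")
```

On real outcome data the learned weights are the "strange correlations" the computer can find even when no human can explain them.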
0:17:33 Now the problem is that in the United States,
0:17:35 you actually can’t do this,
0:17:37 because it might have an adverse impact.
0:17:39 And what does an adverse impact mean?
0:17:42 There actually was outright and terrible discrimination
0:17:44 in lending in the United States,
0:17:46 and there's unfortunately terrible discrimination
0:17:47 in many things in the United States,
0:17:50 but lending was one of many.
0:17:54 So imagine that I said, are you married or not?
0:17:57 Oh, you're not married, I'm not gonna make you a loan.
0:17:59 Well, that’s illegal now.
0:18:00 Are you this race?
0:18:01 Oh, I’m not going to make you alone.
0:18:02 Well, that’s illegal now.
0:18:03 So what did people do to get around that?
0:18:06 the people that were actual racists,
0:18:08 or actual, like maybe they weren’t racist
0:18:09 or discriminatory at heart,
0:18:11 but they were picking up on cues.
0:18:13 They’d say, oh, what part of town do you live in?
0:18:15 Well, you live on that part of town.
0:18:19 Well, that’s like 100% correlated with this race,
0:18:20 or this gender, or this, that.
0:18:22 I'm not going to make you a loan.
0:18:23 So the law was strengthened,
0:18:26 so there’s a law called Fair Lending in the United States.
0:18:27 And then one of the components of it
0:18:30 is this idea of called adverse impact.
0:18:31 And it’s different than adverse selection.
0:18:34 It’s saying, I don’t care what you said you did
0:18:38 for why you rejected Frank for a loan.
0:18:41 If it turns out that everybody in your reject pile
0:18:45 has a disproportionate gender ratio, race ratio,
0:18:46 something like that,
0:18:49 I'm going to assume that your underwriting standards
0:18:50 are having an adverse impact.
0:18:53 So you as a bank couldn’t say,
0:18:55 hey, look, I asked him if he had cats.
0:18:58 And I’m using that to make the loan decision.
0:18:59 If it turned out that having cats
0:19:04 was correlated with being a particular race,
0:19:09 they couldn't use the cat answer to deny you a loan.
0:19:10 – Correct, because that was,
0:19:12 and in all fairness to the law,
0:19:15 this is what people used to do with geography.
0:19:16 What zip code do you live in?
0:19:18 Oh, you live in that zip code?
0:19:21 100%, you were a member of this particular race,
0:19:23 and the intent all along was to discriminate
0:19:25 against people of that particular race.
0:19:28 But now instead of using loan officers that use
0:19:30 God knows what to decide,
0:19:31 do I want to make you the loan or not,
0:19:33 you're using a computer, and you can look at the code.
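One simple way to picture the adverse-impact check is to compare approval rates across groups; the "four-fifths" threshold below is a common screening heuristic borrowed from US employment-discrimination guidance, and the counts are invented for the sketch (an illustration, not a statement of what fair-lending law requires):

```python
# Sketch: flag a model whose approvals skew heavily against one group.
# Counts are made up; 0.8 is the conventional four-fifths screening threshold.
def adverse_impact_ratio(approved_a, total_a, approved_b, total_b):
    """Ratio of the lower group's approval rate to the higher group's."""
    rate_a, rate_b = approved_a / total_a, approved_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

ratio = adverse_impact_ratio(approved_a=300, total_a=1000,   # group A: 30% approved
                             approved_b=550, total_b=1000)   # group B: 55% approved
print(f"impact ratio: {ratio:.2f}", "-> flag for review" if ratio < 0.8 else "-> ok")
```

The point of the auditability argument is that this kind of check can be run directly against the model's decisions, whatever features it used.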
0:19:35 So I think there is a lot of,
0:19:38 there are some anachronistic laws
0:19:39 that have to catch up here,
0:19:42 but let’s take an area outside of the US
0:19:44 to answer your question,
0:19:47 where perhaps you don’t have interest rate caps,
0:19:49 because the thing that a lot of people say,
0:19:51 oh, 200% interest is terrible.
0:19:55 500%, that sounds awful, you should go to jail for that.
0:19:56 But what does APR mean?
0:19:59 APR stands for annual percentage rate.
0:20:01 And what if I’m giving you a four day loan?
0:20:04 So I say, okay, I’m gonna loan you $9 right now.
0:20:07 You don’t look very trustworthy.
0:20:09 I want you to pay me back $10 on Monday.
0:20:11 – Yeah, that doesn’t sound so bad.
0:20:13 – Yeah, it’s like, you’re gonna pay me a dollar.
0:20:16 But what is that on an APR basis?
0:20:19 That’s like 9,000% made that up.
0:20:20 But it’s probably about that, right?
0:20:23 Because it’s 10% every four days, or every three days,
0:20:25 10% every three days, and that cumulates.
0:20:28 Like that’s a lot of money or a lot of interest
0:20:31 on an APR basis, but it’s the wrong metric
0:20:33 because effectively it’s like trying to figure out
0:20:37 what your marathon time is based on your 100 meter dash.
0:20:40 Like the winning marathon time would be an hour.
0:20:41 And that’s not true.
0:20:42 We know that nobody can run
0:20:44 a two-hour marathon right now.
0:20:45 Yeah, so maybe Angela can.
0:20:46 – Maybe Angela.
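To make the APR point concrete, here is the $9-now, $10-on-Monday loan annualized both ways; the exact figure quoted above is a throwaway, but the mechanics below show why a one-dollar fee on a four-day loan looks enormous as an annual rate:

```python
# A $1 fee on a $9 loan repaid in 4 days, expressed as annualized rates.
principal, repay, days = 9.0, 10.0, 4
period_rate = repay / principal - 1              # ~11.1% for the 4 days

simple_apr = period_rate * 365 / days            # no compounding
compound_apy = (1 + period_rate) ** (365 / days) - 1

print(f"rate for the 4 days: {period_rate:.1%}")
print(f"simple APR:          {simple_apr:,.0%}")
print(f"compounded APY:      {compound_apy:,.0%}")
```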
0:20:50 – So there’s a company that we invested in called Branch.
0:20:52 And what they’re doing is they just collect
0:20:54 every form of data possible.
0:20:57 And they look for these strange correlations.
0:21:00 And the interest rates on an APR basis might be high,
0:21:02 but they’re really charging like a dollar.
0:21:04 – And these are small loans, right?
0:21:05 – Very, very small loans.
0:21:07 So I loan you, and actually the other interesting,
0:21:09 like one of the nice data points
0:21:11 that they’re accumulating over time
0:21:15 that is a really interesting idea, I think.
0:21:16 It’s not new.
0:21:18 In fact, it’s almost back to the future old
0:21:19 where they loan you a dollar.
0:21:20 If you pay it back, they loan you $2.
0:21:22 If you pay it back, they loan you $4.
0:21:24 If you pay it back, they loan you $10.
0:21:26 And they ladder up your credit
0:21:29 and they keep that information proprietary to them.
0:21:33 Because induction turns out to be a pretty good formula
0:21:36 for figuring out not so much the ability to repay,
0:21:37 but the willingness to repay.
0:21:40 You’ve established a pattern of willingness to repay,
0:21:43 but they also look at where were you today.
0:21:46 And again, you provide all this information
0:21:47 in order for them to crunch this,
0:21:49 in order for them to give you a loan
0:21:50 at ideally a lower rate.
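The credit ladder described a moment ago is easy to picture as a tiny policy: move up a rung on every on-time repayment, fall back on a miss. The specific rungs and the fall-back rule here are assumptions for the sketch, not Branch's actual policy:

```python
# Sketch of a credit ladder: the limit grows only by demonstrated repayment.
def next_limit(current_limit, repaid_on_time, ladder=(1, 2, 4, 10, 25, 50)):
    """Move one rung up the ladder on repayment; fall back to the bottom on a miss."""
    if not repaid_on_time:
        return ladder[0]
    higher = [step for step in ladder if step > current_limit]
    return higher[0] if higher else current_limit

limit = 1
for repaid in [True, True, True, False, True]:
    limit = next_limit(limit, repaid)
    print(limit)          # 2, 4, 10, 1, 2
```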
0:21:51 Because the more information,
0:21:53 because it’s kind of twin pillars, right?
0:21:55 The less information we have,
0:21:57 the higher the rate that we have to charge.
0:21:58 Not because we’re evil,
0:22:00 but because otherwise you’re gonna have a market failure.
0:22:01 Like you have in lots of-
0:22:02 – You have the bin and ball problem, right?
0:22:03 – Exactly. – Because you have no idea
0:22:04 how many deadbeats.
0:22:05 – Exactly.
0:22:07 And if I don’t have any idea,
0:22:08 I either have to charge a high rate
0:22:10 or not charge anything at all.
0:22:11 And not charge anything at all
0:22:13 doesn’t mean like everybody gets a 0% loan.
0:22:14 It means I don’t make any loans.
0:22:16 And like both of those are bad outcomes.
0:22:19 The better outcome is you accumulate more data
0:22:21 and you figure out here are the good people,
0:22:23 let me not accept the bad people.
0:22:25 Because again, the way that the good people
0:22:27 end up paying more money
0:22:29 is if the company starts accepting more bad people
0:22:32 because it goes back to what I said at the beginning,
0:22:35 which is more customers in this unique industry
0:22:38 often is bad if you don’t understand
0:22:40 how to select them correctly.
0:22:42 And for many of these new fangled lending
0:22:43 and insurance companies,
0:22:46 the default customer is going to be adversely selected
0:22:47 because if you’re a new lender
0:22:49 and you have no underwriting standards,
0:22:53 you're basically advertising free money, never pay it back.
0:22:54 And those are the people that will be attracted to you,
0:22:57 both the criminals and the non criminals in droves.
0:23:01 – Yeah, so this is sort of startup attack wedge number two,
0:23:03 which is I’m going to generate a new data source
0:23:06 that allows me to price my product in a way
0:23:09 or reach a customer that a traditional company
0:23:11 would never even try or they don’t have the data source
0:23:13 or they have the bin and ball problem.
0:23:16 So what are the types of data that Branch went to go get
0:23:17 to try to figure out,
0:23:19 should I give you a loan of a dollar or two?
0:23:21 – Well, the other type of data,
0:23:23 so branch was somewhat unique
0:23:27 in that they said we’re going to get data from your phone.
0:23:29 And it seems odd.
0:23:33 He’s like most lenders in the developed world
0:23:35 or not developed versus undeveloped,
0:23:37 it’s really like with developed credit infrastructure.
0:23:39 – Yeah, if there’s a credit bureau.
0:23:41 – They look up your credit report, if it’s good,
0:23:43 they make you a loan, if it’s bad, they don’t make you a loan.
0:23:45 It’s actually not that hard.
0:23:47 And there are all sorts of nuances that you can layer on top,
0:23:49 but this is how it’s been working for a long time
0:23:51 in the United States as an example.
0:23:55 Whereas there, it was like, okay, where did you work today?
0:23:57 Did it look like you worked today?
0:23:59 So it was stuff like that
0:24:01 and even like how many apps do you have on your phone?
0:24:04 Like weird stuff that you would never assume
0:24:07 actually has any kind of indication
0:24:09 of willingness or ability to repay,
0:24:10 but in many cases it does.
0:24:12 Like are you gambling?
0:24:14 Well, if you have a gambling app on your phone,
0:24:17 you’re probably gambling, maybe that’s good.
0:24:18 Yeah, maybe it’s bad.
0:24:20 It’s actually not making human judgments.
0:24:21 And it’s also not looking at any one
0:24:24 of these unique variables as a unique variable.
0:24:26 It’s looking at them in concert
0:24:29 and then correlating them with these outcomes
0:24:31 or really observing the outcomes
0:24:33 and then linking them back to all of these different inputs.
0:24:35 – Yeah, I remember talking to the team
0:24:38 when I was researching my last machine learning presentation
0:24:41 and the fascinating things that I found were
0:24:45 if you received more texts than you sent,
0:24:46 you were more creditworthy.
0:24:47 If you had the gambling app,
0:24:49 you were more creditworthy rather than less,
0:24:52 which is not kind of what you would expect.
0:24:54 If you burned through your battery,
0:24:56 you were more likely to default, right?
0:24:59 So like all of these things where human loan officers
0:25:02 would never really guess,
0:25:03 and they probably would guess the wrong way.
0:25:05 – Right, because many of them are counterintuitive.
0:25:08 And then many of them are not, they're not univariate.
0:25:10 Like so it’s not just, I mean, I don’t know,
0:25:12 but it’s not just the battery thing,
0:25:14 it’s the battery thing with this, with that, with that.
0:25:15 – It’s the combinations.
0:25:17 – And it’s like humans can only really observe
0:25:20 three dimensions plus time, so I guess four,
0:25:23 and these are 9,000 dimensional problems.
0:25:25 So it’s just, it’s much, much more challenging
0:25:27 for humans to really grok.
0:25:28 – Yeah, got it.
0:25:31 So that’s the sort of the second category of attack,
0:25:33 which is you generate a new data source
0:25:36 and then that allows you to price or find customers
0:25:38 in sort of a more cost effective way.
0:25:40 Let’s talk about the third,
0:25:45 which is around sort of fundamentally changing behavior.
0:25:47 So why don’t you talk about,
0:25:49 maybe Earnin is a good example of this?
0:25:52 – Yeah, so if you assume that humans are static,
0:25:55 so they're born, both of our Camerons were born,
0:25:58 and their DNA is set upon birth.
0:25:59 Maybe it changes a little bit with some mutations
0:26:01 from some gamma rays here and there,
0:26:03 but it’s set upon birth,
0:26:05 and then human behavior never changes.
0:26:06 And that’s one way of looking at things.
0:26:08 And then you think about adverse selection
0:26:09 versus positive selection.
0:26:11 Good drivers are always good drivers.
0:26:12 Bad drivers are always bad drivers.
0:26:14 Let’s just get the good drivers.
0:26:16 So the other category,
0:26:17 and it’s not to say that these other two groups
0:26:18 don’t do this,
0:26:20 but if I look at a company like Ernie,
0:26:24 most payday lenders are reviled
0:26:25 because they charge high fees,
0:26:29 they don’t educate their borrower very well.
0:26:31 Now it actually provides a valuable service
0:26:33 because if I’m getting paid next Friday,
0:26:36 but my rent is due today and I don’t have money,
0:26:37 do I want to get evicted?
0:26:41 No, I want to get paid right now,
0:26:43 and the only person that does this is the payday lender,
0:26:46 but the payday lender is competing with other payday lenders
0:26:49 for advertising in the local newspaper or something,
0:26:51 and if they’re able to rip me off more,
0:26:52 not because they’re evil,
0:26:55 but because they have to afford the advertising spot,
0:26:56 they're now incented to do so.
0:26:59 So it’s just, it’s a vicious cycle.
0:27:00 So let's talk about Earnin.
0:27:02 So what Earnin does is they say,
0:27:05 okay, we know that you’ve worked this long.
0:27:09 So again, new data source because the phone’s in your pocket
0:27:12 and you work at Starbucks and you’re getting paid hourly
0:27:13 and we’ve seen the phone in your pocket
0:27:16 or in your locker in the Starbucks office
0:27:19 and you’re by the barista counter for eight hours.
0:27:21 So you worked, we saw your last paycheck,
0:27:22 hit your bank account,
0:27:24 we know that that’s where you work,
0:27:25 we’re not taking your word for it,
0:27:27 we have real-time streaming information about this,
0:27:33 and now we will give you your money whenever you want.
0:27:35 Not money that you haven’t earned yet,
0:27:36 but money that you have earned,
0:27:38 but you actually haven’t gotten paid for yet.
0:27:41 And then you can tip us, there’s no cost.
0:27:44 If you want, you can give us– – No interest, no fee, no–
0:27:45 – If you want to pay us nothing, that’s fine.
0:27:47 I mean, we would appreciate if you pay us something
0:27:49 because obviously we’re providing valuable service for you.
0:27:52 And then you can even give tips for your friends,
0:27:54 there’s this community that’s really emerged
0:27:55 of people on Ernan.
0:27:57 And actually, if you look back at different business model,
0:28:00 but this idea of microfinance in general,
0:28:04 so if you think about Mohammed Yunus and what he did,
0:28:07 this idea of can you encourage people
0:28:11 to pay back loans using social pressure.
0:28:13 So again, not adverse selection versus positive selection,
0:28:16 but actually trying to force everybody
0:28:18 down positive behavior.
0:28:20 – Let’s get the community to encourage repayment.
0:28:24 – Right, because then saying,
0:28:27 or let’s get the community to encourage people
0:28:29 actually driving safely.
0:28:33 Because there’s underwriting at the time of admission,
0:28:36 there’s underwriting based on ongoing behavior.
0:28:38 So like many of the car insurance companies
0:28:41 that are brand new are saying we will re-underwrite you.
0:28:44 Like, yeah, if you drive like Frank when you signed up,
0:28:48 great, but now you switched into like race car driver mode
0:28:49 and you were trying to hack us,
0:28:52 but we’re actually monitoring your speedometer at all times.
0:28:53 So guess what?
0:28:54 You got a higher rate now.
0:28:57 So that might encourage you to drive safely.
0:29:00 If I’m Frank and I drive safely in my Prius,
0:29:03 but then I decide and then I got a really good rate
0:29:05 on my car insurance as a result.
0:29:07 And now I’m like, aha, I game the system.
0:29:10 Now I’m going to drive like a maniac.
0:29:12 Well, the nice thing is that you can make
0:29:14 underwriting dynamic and you can say, all right,
0:29:17 we’re actually going to re-underwrite you every day.
0:29:18 So we had the positive selection
0:29:20 to try to attract the Franks.
0:29:24 We have the continuous evaluation to try to encourage
0:29:27 the right behavior post Frank sign up
0:29:30 and also to stop the gaming of it, which is like,
0:29:33 I'm gonna pretend to be safe and then drive like a maniac.
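A sketch of that "re-underwrite you every day" idea: recompute the premium from a rolling window of driving telemetry, so a good rate earned at sign-up doesn't survive maniac driving. The features, weights, and base rate are all invented for the illustration:

```python
# Sketch: daily re-pricing from driving telemetry (all parameters are made up).
def daily_premium(base_rate, hard_brakes_per_100mi, pct_time_over_80mph, night_miles_share):
    """Scale a base monthly premium by a crude risk multiplier from recent telemetry."""
    risk = 1.0
    risk += 0.05 * hard_brakes_per_100mi        # frequent hard braking
    risk += 1.50 * pct_time_over_80mph          # sustained high speed
    risk += 0.30 * night_miles_share            # share of miles driven at night
    return round(base_rate * risk, 2)

print(daily_premium(100, hard_brakes_per_100mi=0.5, pct_time_over_80mph=0.00, night_miles_share=0.1))  # Prius-mode Frank
print(daily_premium(100, hard_brakes_per_100mi=6.0, pct_time_over_80mph=0.15, night_miles_share=0.4))  # race-car-mode Frank
```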
0:29:35 But then how do you actually get?
0:29:38 What if Frank was a bad driver initially,
0:29:42 doesn’t fall into my positive selection loop,
0:29:47 but I still want to try to make Frank a better driver.
0:29:48 – If I could turn him into a good driver,
0:29:49 he’d be profitable.
0:29:52 – Right, so because that’s the flaw
0:29:55 with kind of wedge one and wedge two
0:29:57 of like creaming the crop, really wedge one,
0:29:58 which is we’re going to cream the crop.
0:29:59 We’re going to do what SoFi did.
0:30:00 We’re going to do with health IQ.
0:30:03 I mean, it’s a great strategy,
0:30:07 but the rest, again, if you assume that it’s all nature
0:30:08 and there’s no nurture,
0:30:10 then perhaps there’s nothing you can do.
0:30:13 But if you can actually try to nurture better behavior,
0:30:17 you actually see better, you do see better behavior
0:30:19 and then the profitability goes up.
0:30:21 So, and the interesting thing there
0:30:24 is that you’re still finding mispriced customers,
0:30:26 but you’re actually helping turn them
0:30:28 into correctly priced customers.
0:30:31 So somebody like a bank would turn away that customer
0:30:33 and say, we don’t want them
0:30:37 because they have a 500 FICO, which is really bad.
0:30:39 And then you have to figure out,
0:30:42 and as with all of the new startups
0:30:43 that are saying we only want the best customers,
0:30:46 we want to leave the banks with the bad customers,
0:30:48 but it’s kind of the twin pillars of
0:30:52 can you identify something that’s below that credit score
0:30:54 or below that driving score or something,
0:30:56 and then can you encourage positive change?
0:30:58 And if you can, then you can start actually
0:31:02 creaming the crop of the bottom half of the customers,
0:31:02 right, not even the bottom half.
0:31:04 It’s the customers that are just neglected
0:31:06 because nobody wants to underwrite them.
0:31:09 And then you do that, you take them on
0:31:12 because you have a secret to change their behavior.
0:31:14 – Right, you’re seeing a lot of companies
0:31:18 that sort of are using behavioral economics research
0:31:22 to figure out how do I nudge people into better behavior.
0:31:23 And so this would be an example
0:31:26 of how you’re trying to change behavior
0:31:28 to get the profitable customer.
0:31:31 – Right, so there is one company
0:31:33 in the lending space a while ago called Vouch.
0:31:35 I think ultimately it didn’t work,
0:31:38 but when you apply for a loan,
0:31:41 it actually kind of taps your social network
0:31:44 and it requires that they do a reference for you.
0:31:45 Either a reference in terms of like,
0:31:49 yes, Frank is a good customer, you can trust him.
0:31:52 And even kind of a co-commit.
0:31:56 So I’m getting a loan for $1,000 and you say,
0:31:59 yeah, Alex is okay, or I’m saying Frank is okay.
0:32:01 And if he doesn’t pay you back,
0:32:05 I will put $100 in, because that’s how confident I am.
0:32:07 And it’s not all 1,000, but it’s 100.
0:32:11 And then you’re my friend, I go bowling with you.
0:32:13 We take our Camerons out together.
0:32:16 And if you don’t pay back this $1,000
0:32:19 to this kind of faceless, large, evil corporate entity,
0:32:21 not really, but if you don’t pay that back,
0:32:23 I’m on the hook for a hundred bucks.
0:32:25 I’m not going bowling with you anymore.
0:32:27 So there are other things that are really interesting
0:32:30 to try to encourage the correct form of behavior.
0:32:33 And actually, part of it is just making it personal.
0:32:35 Like this was the whole Eunice theory,
0:32:37 which is if you are kind of held accountable
0:32:41 by your peers, that is so much more powerful
0:32:44 than getting a collections call from Citibank.
0:32:46 Like you’re like, ooh, that’s the collections never,
0:32:48 iPhone block, done.
0:32:50 But how am I going to block my friends out?
0:32:53 If Alex calls me and says you really got to pay
0:32:55 that loan back, otherwise I’m out a hundred bucks, right?
0:32:56 That’s much more powerful.
0:32:58 I mean, this has worked great for a lot of health stuff
0:32:59 in a different domain, right?
0:33:03 Which is if you are trying to get a pre-diabetic patient,
0:33:06 not to get diabetes, the most effective thing to do
0:33:08 is lose something like six or 7% of your body mass.
0:33:11 And the way they do it is they get you into a group.
0:33:14 They mail everybody a scale.
0:33:16 Everybody sees your weight in the morning, right?
0:33:18 Like that’s a powerful motivator.
0:33:20 – Yeah, I mean, this stuff, psychology is very powerful.
0:33:24 So there are a lot of tricks that you can use here.
0:33:27 And if you understand the impact of them,
0:33:29 you actually have to reassess your entire branding
0:33:31 and customer acquisition strategy, right?
0:33:32 – Right, right.
0:33:35 All right, so remember I opened up pretending
0:33:38 to be the product manager at Visa.
0:33:41 And now we’ve gone through all of these three categories
0:33:42 of how the startups are coming for me.
0:33:45 And like, I’m starting to sweat here, right?
0:33:47 They can come get my best customers.
0:33:48 They can generate new data sources
0:33:50 that like I would have a hard time doing.
0:33:53 They can actually even go after sort of worse customers,
0:33:54 change their behavior,
0:33:56 turn them into profitable customers.
0:33:57 I’m scared now.
0:33:59 Like what in the world should I do?
0:34:02 Like you’re in my seat, you’re the head of innovation
0:34:04 or head of strategy or head of digital
0:34:06 at one of these big FinTech companies.
0:34:10 What should I do with respect to startups?
0:34:11 – Well, I think it’s actually very hard
0:34:13 for a company that’s trying to be all things
0:34:15 to all customers.
0:34:16 Because if you look at what SoFi is,
0:34:18 look at SoFi’s brand.
0:34:20 Brand is, you know, we are the high,
0:34:21 like if you’re great, you’re good enough for us.
0:34:22 – If you’re Henry, right?
0:34:24 – If you’re a Henry, you’re good enough for us.
0:34:25 Health like you.
0:34:27 If you’re healthy, you’re good enough for us.
0:34:30 So on that sector of the curve,
0:34:33 how does GEICO say, hey, if you’re a good driver,
0:34:35 go to this special part of GEICO.
0:34:37 If you’re a regular driver, you still save 15%.
0:34:40 If you’re a bad driver and you had a DUI,
0:34:41 well, we can cover you over here.
0:34:46 It’s just, it’s lost in this kind of giant GEICO,
0:34:48 you know, GEICO marketing message.
0:34:52 So in many cases, it actually helps to have sub-brands
0:34:55 and divide this up, which is somewhat anathema
0:34:56 to a lot of companies that want to say,
0:34:59 how do we get as much efficiency and synergy as possible?
0:35:01 We’re gonna have one overarching brand.
0:35:03 And, you know, one of my favorite examples
0:35:04 of this kind of different industry,
0:35:07 but the highest end of the highest end of jewelry
0:35:08 is Tiffany and Co.
0:35:10 Or one of the highest end of the highest end.
0:35:13 And for a long time, it was owned by Avon.
0:35:14 – No, really?
0:35:15 – You know, the Avon lady, Avon.
0:35:19 So, and if Avon bought Tiffany, which they did,
0:35:20 and they said, okay,
0:35:23 we’re gonna rebrand Tiffany and Co as Avon,
0:35:25 like that doesn’t work.
0:35:27 Like you’re not gonna get, you know,
0:35:30 80% gross margins on whatever they sell at Tiffany and Co.
0:35:33 – "Tiffany by Avon" just doesn't have quite the right ring.
0:35:34 – It doesn’t work.
0:35:36 And then for Avon to say, okay, you know,
0:35:39 the door-to-door salesperson or sales lady
0:35:41 with the pink Cadillac that’s going around,
0:35:45 like we’re now going to have her push $2,000 bracelets
0:35:48 as opposed to the normal $10 fare,
0:35:49 like that’s not gonna work either.
0:35:51 But it actually can make sense
0:35:54 if you want to just appeal to more customers,
0:35:55 you have different brands
0:35:57 and you don't wanna suck them all together.
0:35:59 So you can imagine instead of having, you know,
0:36:01 Geico could be your generic brand,
0:36:03 but then you could have,
0:36:04 I think I mentioned this to you once before,
0:36:05 a friend of mine is Mormon,
0:36:07 doesn’t drink alcohol,
0:36:10 and says we should have Mormon insurance for cars
0:36:11 because it’s just totally unfair.
0:36:13 Again, going back to the psychology point,
0:36:16 like why is it that I’m paying for the drunk idiot
0:36:17 that goes through the stop sign,
0:36:20 I don’t drink, I can prove that, I will never drink,
0:36:21 I have a million friends just like me
0:36:22 that will never drink,
0:36:23 we should all get car insurance,
0:36:25 we should all get a 40% lower rate.
0:36:28 Do they think of Geico when they go there?
0:36:29 Maybe they could,
0:36:32 but it could be like Mormon car insurance,
0:36:33 or something, I’m not good at branding,
0:36:35 but you could have a separate brand
0:36:37 for all of these separate subgroups
0:36:40 and have the same underlying infrastructure
0:36:41 behind all of them,
0:36:44 but again, part of this is just how do you brand
0:36:45 and how do you market effectively?
0:36:48 Because if you look at the efficacy of Health IQ ads,
0:36:50 or the efficacy of SoFi ads,
0:36:51 they’re so much higher,
0:36:53 because again, you have this large group of people,
0:36:56 or in many cases small but valuable groups of people,
0:36:58 that feel like they’re being treated unfairly.
0:37:01 So yeah, Geico is save 15% on auto insurance,
0:37:03 click here.
0:37:08 Mormon car insurance advertised to LDS members in Utah,
0:37:11 shooting fish in a barrel,
0:37:13 that’s gonna have a dramatically higher click rate,
0:37:14 and then many of these products
0:37:16 are also very demand elastic.
0:37:19 So I’m not saying save 15% on car insurance,
0:37:22 I’m saying save 80% on car insurance,
0:37:24 it’s very easy to do, click here,
0:37:26 positive selection bias,
0:37:28 that’s gonna work better than like Geico,
0:37:30 but we also have something for Mormons too.
0:37:31 – Right, yeah.
0:37:34 The goal is to find the LDSers and the hypermilers
0:37:36 who are really safe, et cetera, et cetera, right?
0:37:38 And so it’s very counterintuitive,
0:37:39 because if you’re at a big company,
0:37:40 you’re thinking scale,
0:37:42 how do I get the next increment
0:37:44 of revenue growth or profit,
0:37:46 and you’re saying actually go the other way,
0:37:48 don’t try to make your single brand bigger,
0:37:50 try to think about a dozen sub-brands,
0:37:54 each going after sort of the perfect market for them.
0:37:57 How do you positively select into a sub-market?
0:37:59 – Well the other side effect of this is that,
0:38:02 part of the asymmetric warfare
0:38:03 that some of the startups have,
0:38:05 is that if you wanted to kill Geico,
0:38:08 you wouldn’t steal 100% of their customers.
0:38:10 I mean if you did that, that would almost be too obvious,
0:38:12 you’d steal 20% of their customers,
0:38:14 but only the good ones.
0:38:17 So imagine that Geico could actually devolve or evolve,
0:38:19 depending on your point of view, into 10 sub-brands.
0:38:21 There’s no more Geico,
0:38:24 but it’s just like the 10 sub-brands basically select
0:38:25 for the right types of customers,
0:38:29 or even help judge and improve behavior
0:38:31 from other subsets of customers,
0:38:35 and then expel the 30% that are just bad news.
0:38:38 And if you can expel the 30% that are bad news,
0:38:41 you might say okay, well all of this dis-synergy
0:38:43 of going from one brand into 10 sub-brands,
0:38:45 well that was idiotic.
0:38:46 Because now I have fewer customers,
0:38:47 but actually no it isn’t.
0:38:49 Because you might have fewer customers,
0:38:50 but it’s not like selling widgets.
0:38:52 You’re selling probabilistic widgets,
0:38:55 where in many cases you have negative gross margin
0:38:57 when you sell a widget.
0:38:58 So it’s important to figure out,
0:39:02 how do I get the good ones, keep the good ones,
0:39:04 and then get rid of the bad ones.
0:39:06 – Yeah, so that’s one strategy,
0:39:10 which is sort of sub-brands and sort of customer segmentation.
0:39:13 What if I've been told by my management team,
0:39:16 go find a bunch of startups to work with, right?
0:39:18 Sort of somehow figure out a marketing
0:39:23 or co-selling relationship so that we can start experimenting
0:39:24 with some of these new models,
0:39:26 and we can keep an eye on the startup community
0:39:29 so that maybe we can put ourselves in the best place
0:39:31 to buy them if it turns out working.
0:39:33 Is there a way to do that?
0:39:34 – Well there are many ways to do that.
0:39:37 Probably the easiest way that is often counterintuitive
0:39:39 for a lot of big companies is I call this
0:39:41 the turn down traffic strategy.
0:39:44 So Chase turns down a lot of people for loans,
0:39:47 either because, again, it’s the bin and ball problem
0:39:50 where it’s like, well, you might be good, you might be bad.
0:39:51 Sometimes it’s not even that.
0:39:52 It’s like, we think you’re good,
0:39:56 but we just can’t profitably underwrite a $400 loan.
0:39:58 But Chase has all the traffic.
0:39:59 So what is turn down traffic?
0:40:02 It’s saying, okay, we rejected you.
0:40:04 Hey, here’s a friend that you might like.
0:40:05 So this is not cream of the crop.
0:40:08 This is the bottom tier on the ingestion point
0:40:12 for a big financial institution saying we don’t want you,
0:40:14 which is kind of a mean thing to say.
0:40:17 A way to ameliorate that potentially is saying
0:40:19 we don’t want you because we’re not smart enough to,
0:40:22 hey, sorry, we’re working on it.
0:40:23 All our systems are down.
0:40:25 But here’s a great startup that does.
0:40:27 Now why would you send customers to a startup?
0:40:29 Well, the number one thing is
0:40:32 that Geico spends $1.2 billion a year on advertising.
0:40:34 It’s really hard to compete with that from a,
0:40:37 so if I could not spend a dollar of advertising
0:40:42 but give 90% of my net income to Geico as a startup,
0:40:43 I still might make that trade.
0:40:44 I mean, we don’t always like this
0:40:45 because we wanna see do you have
0:40:47 your own acquisition strategies,
0:40:49 your own acquisition channels,
0:40:51 you’re not dependent on the big company.
0:40:52 But from the big company’s perspective,
0:40:55 turn down traffic is often brilliant.
0:40:57 Because it’s saying here’s somebody
0:40:59 that knows how to underwrite better than we do
0:41:01 or more profitably than we do,
0:41:03 we’re going to send our customers,
0:41:05 otherwise what happens?
0:41:07 And this is what I think Amazon got right
0:41:10 in an era where everybody else got this wrong.
0:41:12 Amazon said, okay, you’re on Amazon’s website
0:41:14 and you’re looking at the Harry Potter book.
0:41:17 And then right next to our Harry Potter book
0:41:20 is an ad for Barnes & Noble for the Harry Potter book.
0:41:21 Barnes & Noble is like, this is amazing.
0:41:23 We can buy ads on Amazon’s website.
0:41:24 They’re so stupid.
0:41:26 We’re buying ads and stealing their customers.
0:41:29 But every time you click on that Barnes & Noble ad,
0:41:31 Amazon made a dollar and it’s 100% gross margin.
0:41:32 They share that with nobody.
0:41:33 There’s no cogs on that.
0:41:36 And then they can use that dollar of pure profit
0:41:39 to lower their cost of their Harry Potter book.
0:41:41 Which actually made more people wanna
0:41:43 go to Amazon to look for Harry Potter
0:41:45 than go to Barnes & Noble, which said,
0:41:47 we’re locking you within our walls.
0:41:49 It’s like a casino with no clocks.
0:41:50 And we’re gonna pump oxygen in.
0:41:53 So because what a lot of big companies don’t get
0:41:54 is that Google is just one click away.
0:41:57 Like why give all the excess profits to Google?
0:42:00 When I go to Chase, I get turned down for a loan.
0:42:01 And then I go back to Google and I say,
0:42:03 where else can I get a loan?
0:42:05 Well, Chase should be sending you there.
0:42:06 And actually they’re starting to do this.
0:42:08 So that’s one strategy that I think has a lot of legs.
0:42:09 – Yeah, so turn down traffic.
0:42:10 That’s super interesting.
0:42:13 Look, you spent all the money to bring them to your site
0:42:15 and otherwise you would have just lost them.
0:42:17 That sort of sunk cost.
0:42:19 So you get something out of it.
0:42:20 That’s fantastic.
0:42:22 Well, why don’t we finish this segment out?
0:42:24 I wanna do a lightning round with you.
0:42:26 Which is I want sort of instant advice
0:42:27 for somebody in this seat.
0:44:29 I'm an executive at Visa or at Geico.
0:42:31 And so I’m gonna name a category
0:44:34 of how you sort of deal with startups
0:42:35 and you can react to it.
0:42:38 All right, so category one is
0:42:41 you should always invest super early
0:42:43 as early as you can into a startup.
0:42:45 – So again, remember adverse selection
0:42:46 versus positive selection.
0:42:48 So I would say it comes down to competence.
0:42:50 So this is what you have to get right,
0:42:53 which is if you take nine weeks to make a decision
0:42:55 and like, you know, we’ll decide within a day
0:42:56 or if Sequoia or Benchmark
0:42:58 or some other great venture capital firm
0:42:59 will decide within a day.
0:43:01 Like you’re not going to get good deals
0:43:02 if you take nine weeks.
0:43:05 So it can be very, very important to invest early,
0:43:07 but like the best things always seem overpriced.
0:43:09 Like this is something that we’ve learned
0:43:10 and it’s the same thing
0:43:12 with underwriting your own customers,
0:43:14 which is like if something’s too good to be true,
0:43:15 it probably is.
0:43:17 So some of the best things are actually very expensive.
0:43:18 Yeah.
0:43:19 All right.
0:43:21 – Just given those dynamics,
0:43:22 just wait for the later rounds.
0:43:24 Let all the venture guys take all the risk
0:43:26 and then like you plow in late.
0:43:28 That should be my strategy.
0:43:29 – I think in general,
0:43:30 that’s probably a better strategy.
0:43:31 But again, if you’re saying,
0:43:33 ooh, we’re getting a great deal on this one,
0:43:34 that’s probably when
0:43:37 you know that you’re the adverse-selection
0:43:40 source of capital, as opposed to, okay,
0:43:41 here’s something where I can’t believe
0:43:43 we’re paying this much money for it.
0:43:44 We have to fight our way in.
0:43:46 There are 10 other people that want it.
0:43:48 You probably know you’re on to a good customer,
0:43:50 if you will, or a good investment.
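
[Editor’s note: a rough way to see the adverse-selection point is to simulate deal flow in which fast investors pick first, so whatever reaches a slow corporate investor “at a great price” is what was left over. The distribution and threshold below are made up purely to illustrate the mechanism; nothing here reflects actual venture data.]

# Toy adverse-selection simulation; distributions and thresholds are invented.
import random

random.seed(0)
deals = [random.gauss(0, 1) for _ in range(10_000)]  # latent deal quality

FAST_THRESHOLD = 0.5  # deals above this get taken within a day by fast firms
leftovers = [q for q in deals if q <= FAST_THRESHOLD]  # what a slow process sees "cheap"

print("average quality, all deals:      %.3f" % (sum(deals) / len(deals)))
print("average quality, leftover deals: %.3f" % (sum(leftovers) / len(leftovers)))
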
0:43:54 – All right, partner with as many possible startups
0:43:56 as you can, ’cause you don’t know who’s gonna win.
0:43:58 So let’s open up a marketplace.
0:44:01 Let’s say 100 startups that I have either turn-down
0:44:03 traffic relationships with or something.
0:44:05 – I think that actually that does make sense.
0:44:08 I mean, there should be some kind of gating item,
0:44:10 so maybe not a hundred.
0:44:14 But how do we stay close to different models
0:44:14 that are working well?
0:44:17 Because the main advantage that the incumbents have,
0:44:20 again, depends on like lending or insurance,
0:44:23 but it’s typically something around cost of capital
0:44:25 and something around distribution.
0:44:27 So if you have both of those
0:44:29 and you’re not using them to the fullest extent,
0:44:30 like you turn down a lot of customers,
0:44:33 then you should try to find an intelligent way
0:44:36 of using them, because that’s your unique thing,
0:44:38 which venture capital firms don’t have.
0:44:40 I can’t fund somebody and send them
0:44:43 a million customers tomorrow, but Geico could.
0:44:45 So, but you can’t do that a hundred times.
0:44:48 You can probably do that for some subset,
0:44:51 according to how much additional traffic,
0:44:53 or whatever the unique advantage is
0:44:55 that you want to bring to bear.
0:44:58 – All right, now on to M&A strategy.
0:45:02 M&A strategy one, buy super early
0:45:05 before it’s proven to work
0:45:07 because presumably the prices are lower.
0:45:12 So for M&A strategy one, focus on early-stage companies.
0:45:14 – I’m a big fan of what Facebook’s done with M&A
0:45:15 and I encourage everybody
0:45:16 in pretty much every other industry to do this.
0:45:18 So Facebook has two formats for M&A.
0:45:22 One is we buy the existential threat that could kill us
0:45:24 and we price it probabilistically.
0:45:28 So surrender 1% of our market cap to buy Instagram.
0:45:29 That was way overpriced.
0:45:31 – Everybody said that.
0:45:32 – Wow, there you go.
0:45:33 – There’s a one in a hundred chance
0:45:35 that this is gonna be bigger than Facebook.
0:45:38 We should probably surrender 1% of our market cap.
0:45:41 WhatsApp, 7% chance or whatever it was.
0:45:43 I think it was 7% of Facebook’s fully diluted market cap
0:45:45 was spent on WhatsApp.
0:45:47 These were brilliant acquisitions.
0:45:48 Oculus, I mean Oculus hasn’t turned out
0:45:50 the same way that WhatsApp has perhaps,
0:45:51 but like same idea.
0:45:52 This could be the new platform.
0:45:54 If we don’t buy this and Apple does,
0:45:57 we are subject to their random whims and fancies.
0:45:58 So that’s category one.
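
[Editor’s note: as a very rough sketch of the probabilistic pricing logic described here. The market cap figure and probabilities below are placeholders for illustration, not the actual deal numbers.]

# Sketch of "price the existential threat probabilistically";
# figures are stand-ins, not the real Facebook numbers.
def max_rational_offer(acquirer_market_cap: float, p_existential: float) -> float:
    """Ceiling on what to surrender if the target has a p_existential chance
    of ending up bigger than (i.e. displacing) the acquirer."""
    return acquirer_market_cap * p_existential

MARKET_CAP = 100e9  # placeholder acquirer market cap
print(max_rational_offer(MARKET_CAP, 0.01))  # an Instagram-style 1% bet
print(max_rational_offer(MARKET_CAP, 0.07))  # a WhatsApp-style 7% bet
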
0:46:01 Category two, and this is super counterintuitive
0:46:02 for a lot of companies,
0:46:04 buy the guys that failed trying.
0:46:09 Because they had the courage and the tenacity
0:46:12 to try to go and build something new
0:46:14 and that’s what you want in your company as well.
0:46:17 And then this is the most counterintuitive part,
0:46:18 which is take the person that failed
0:46:21 and put them in charge of the person that was successful.
0:46:23 And that’s the part that’s breaking glass.
0:46:26 – For a big company, that’s so hard.
0:46:28 You reward your execs on success, not on failure.
0:46:30 – Right, but in many cases it’s like you have a big company
0:46:33 that’s been trying to build this thing for 10 years.
0:46:35 And if they build it, they will get one billion customers.
0:46:37 Because they have, or I’m making that up,
0:46:38 they have the distribution.
0:46:40 Then you have the startup that actually built the thing
0:46:41 in like a week.
0:46:43 And they built it for a million dollars.
0:46:45 And that would take the big company like a billion dollars
0:46:47 in 10 years to do.
0:46:48 But like, oh, the company failed.
0:46:50 Oh, that’s a bad company.
0:46:51 These are bad managers.
0:46:53 But actually you want to take them and put them in charge.
0:46:55 And the joke that I would always make is like,
0:46:59 if Amtrak buys Tesla, the worst thing that Amtrak could do,
0:47:01 because Amtrak is probably more profitable
0:47:02 than Tesla at this point.
0:47:04 But if Amtrak were to buy Tesla,
0:47:06 the worst thing they could do is say, okay,
0:47:09 all of you Tesla bozos, you work for us.
0:47:12 But the whole point of a lot of this other form of M&A
0:47:14 is you’re really trying to buy products
0:47:16 that you can push into your distribution.
0:47:19 And you’re trying to buy talent that wrote the products,
0:47:20 that built the products, that understand that.
0:47:22 And the only thing that they needed,
0:47:25 the only gap between them and actual huge success
0:47:27 is distribution, which these big companies have in droves.
0:47:30 – Yeah, so that makes perfect sense.
0:47:32 Maybe just a piece of advice
0:47:34 on how to actually make that happen.
0:47:36 ‘Cause you have this dynamic where you’re a big company,
0:47:39 you just bought a failing startup, right?
0:47:41 You have all of the execs inside
0:47:43 that have earned bonuses consistently year after year
0:47:45 for awesome performance, right?
0:47:46 You’ve rewarded success.
0:47:47 And now you’re gonna say,
0:47:50 I’m gonna take this guy that kind of failed.
0:47:52 And like, you work for them.
0:47:53 – Right.
0:47:55 – Like, oh, that’s hard to do inside a big company.
0:47:56 – It’s very hard.
0:47:58 But I mean, in some cases, you just wanna do it early.
0:48:00 I mean, I think it actually, where it works best
0:48:02 is where you say, we need this product.
0:48:03 – Yeah.
0:48:03 – We need this product to exist.
0:48:04 We don’t have it right now.
0:48:06 We haven’t spent eight years trying.
0:48:10 Rather than saying, let’s go assemble a team
0:48:11 and I’m gonna rely on like something
0:48:12 that’s just not in our core DNA,
0:48:14 here’s how we’re gonna go shopping.
0:48:16 We’re not gonna go shopping and value this.
0:48:18 And again, this is not a self-serving comment,
0:48:20 because if somebody buys one of our failing companies
0:48:22 for $10 million and we have a billion dollar fund,
0:48:23 it doesn’t matter, right?
0:48:26 Like we want the companies that actually beat the incumbents.
0:48:28 But the incumbents, the way that they can actually do great
0:48:30 is to adopt more of this Facebook mentality.
0:48:35 And like, the key thing is that many of these acquisitions,
0:48:36 these kind of acqui-hires,
0:48:39 that’s the portmanteau of acquire and hire,
0:48:41 these acqui-hire acquisitions that Facebook made,
0:48:45 these people now run big swaths of Facebook.
0:48:47 So I agree, it’s hard to do
0:48:49 if you already have a leader in place.
0:48:50 In that case, it just requires
0:48:52 a very strong-willed leadership team
0:48:55 and an actual overt strategy that this is what we do.
0:48:57 It becomes easier if it’s like, okay,
0:48:58 we’re trying to do this new thing
0:49:00 rather than assemble our own team
0:49:01 and they don’t know what they’re doing
0:49:02 but they’re well-intentioned.
0:49:05 Let’s go buy a company, but let’s buy a company
0:49:06 that hasn’t already done the thing,
0:49:10 but a company that tried and failed to do the thing.
0:49:12 But we’re pretty sure that these are the best triers
0:49:14 and failures in the business.
0:49:17 That’s the hard thing to really measure
0:49:19 because most people are used to measuring outcomes
0:49:20 and not process.
0:49:23 And the key thing to make this strategy work
0:49:26 is you actually want to over-allocate on process
0:49:28 and you want to weight outcome at almost zero
0:49:30 because you’re buying the outcomes that were in fact zero.
0:49:34 – Yeah, Marc is about to interview
0:49:37 Annie Duke, who wrote Thinking in Bets, right?
0:49:39 And this is sort of the essential Thinking in Bets motion,
0:49:41 right, which is don’t confuse a bad outcome
0:49:45 with sort of a bad bet, right, right, exactly, awesome.
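
[Editor’s note: one way to read the advice to over-allocate on process and weight outcome at almost zero is as a simple scoring rule. The weights and scores below are invented for illustration; the only point is that the outcome term is deliberately pushed toward zero.]

# Minimal sketch of weighting process over outcome when screening failed
# startups to acqui-hire; weights and scores are invented for illustration.
def acquisition_score(process_score: float, outcome_score: float,
                      outcome_weight: float = 0.05) -> float:
    """Blend process and outcome, with the outcome weight near zero."""
    return (1 - outcome_weight) * process_score + outcome_weight * outcome_score

# A team that executed well but failed can outscore a team that got lucky.
print(acquisition_score(process_score=9.0, outcome_score=0.0))
print(acquisition_score(process_score=5.0, outcome_score=8.0))
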
0:49:46 Well, thank you so much, Alex,
0:49:48 for coming in and sharing your thoughts.
0:49:50 For those of you in YouTube land,
0:49:52 please like and subscribe.
0:49:54 And for the comments thread on this,
0:49:57 I’d love to get your input on what you thought
0:50:01 of Alex’s idea that what you really should do
0:50:03 is not go after more customers,
0:50:07 but instead go after only the best customers.
0:50:09 So what are examples that you’ve been trying
0:50:11 in your own startup where you’re trying
0:50:14 to implement that idea?
0:50:16 So see you next time.
0:50:18 Go ahead and subscribe to the channel if you like it
0:50:20 and see you next episode.

In this episode of the a16z Podcast — which originally aired as a video on YouTube — general partner Alex Rampell (himself a former fintech entrepreneur, as co-founder and CEO of TrialPay) talks with operating partner Frank Chen about the quickly changing fintech landscape and, even more importantly, why the landscape is changing now.

Should the incumbents be nervous? About what, exactly? And most importantly, what should big companies do about all of this change? But the conversation from both sides of the table begins from the perspective of the hungry and fast fintech startup sharing lessons learned, and then moves to more concrete advice for the execs in the hot seat at established companies.


The views expressed here are those of the individual AH Capital Management, L.L.C. (“a16z”) personnel quoted and are not the views of a16z or its affiliates.

This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. You should consult your own advisers as to those matters. References to any securities or digital assets are for illustrative purposes only and do not constitute an investment recommendation or offer to provide investment advisory services. Furthermore, this content is not directed at nor intended for use by any investor or prospective investor, and may not under any circumstances be relied upon when making a decision to invest in any fund managed by a16z. (An offering to invest in an a16z fund will be made only by the private placement memorandum, subscription agreement, and other relevant documentation of any such fund which should be read in their entirety.)

Past performance is not indicative of future results. Any charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision.

Please see https://a16z.com/disclosures for additional important information.
