AI transcript
0:00:04 I’m Hannah.
0:00:06 Good data, bad data, there’s maybe no other area
0:00:09 where understanding what the evidence actually tells us
0:00:11 is harder than in health and parenting.
0:00:14 In this episode, economics professor Emily Oster,
0:00:16 author of “Expecting Better”
0:00:18 and the recently released “Cribsheet,”
0:00:21 a data-driven guide to better, more relaxed parenting,
0:00:23 does just that, looking at the science and the data
0:00:25 behind the studies we hear about
0:00:27 and make decisions based on in those worlds.
0:00:30 From whether to breastfeed your child to screen time
0:00:32 to sleep training, we talk about what it means
0:00:35 to make data-based decisions in these settings,
0:00:37 in diet and in health and in life,
0:00:39 like whether chia seeds are actually good for you
0:00:42 and how we can tell what’s real and what’s not.
0:00:44 We also talk about how guidelines and advice like this
0:00:47 get formalized and accepted, for better or for worse,
0:00:49 and how they can or can’t be changed.
0:00:52 And finally, how the course of science itself
0:00:55 can be changed by how these studies are done.
0:00:57 – You describe yourself as teasing out causality
0:00:58 in health economics.
0:01:01 Can you give us a little primer on what exactly that means
0:01:02 and how you start going about doing that?
0:01:04 – So there are a lot of settings in health
0:01:06 and in all of those settings,
0:01:09 we have to figure out what does the evidence say.
0:01:11 And I think about some of them in this context of parenting,
0:01:14 but you can think about even questions like,
0:01:15 is it a good idea to eat eggs
0:01:17 or is it a good idea to take vitamins,
0:01:19 other kinds of health decisions.
0:01:20 And you can sort of think about there being
0:01:23 kind of two types of data you could bring to that.
0:01:25 One would be randomized data.
0:01:26 So you could run a randomized trial
0:01:29 in which half of the people got eggs
0:01:30 and half of the people didn’t.
0:01:31 And you followed them for 50 years
0:01:33 and you saw which of them died.
0:01:36 And that would be very compelling and convincing.
0:01:38 And when we have data like that, it’s really great.
0:01:40 – I mean, I kind of think of that as being the default.
0:01:42 No, is that not at all the standard?
0:01:45 – That is the gold standard, but it is not the default.
0:01:47 So many of the kinds of recommendations
0:01:48 that I look at in parenting,
0:01:50 but that you look at in general in health
0:01:52 are based on observational data,
0:01:54 which is the other kind where we compare people
0:01:56 who do one thing to people who do another thing
0:01:58 and we look at their outcomes.
0:02:00 And one of the ways in which the people differ
0:02:02 is on the thing that you’re studying,
0:02:06 but of course there are other ways that they may differ also.
0:02:06 – A million other ways.
0:02:08 – A million other ways, yes.
0:02:10 And data like that is really subject
0:02:12 to these kind of biases that the kind of people
0:02:13 who make one choice are different
0:02:16 from the kind of people who make another choice.
0:02:17 One of the things that’s very frustrating
0:02:19 in a lot of the health literature
0:02:21 is that there isn’t always that much effort
0:02:24 to improve the conclusions that we draw
0:02:25 from those kind of data.
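The selection problem being described, that the kind of people who make one choice differ from the kind of people who make another, can be sketched in a small simulation. Everything below is hypothetical (the egg example, the numbers, the `simulate` helper); it just shows how a naive observational comparison picks up a difference even when the true effect is zero, while a randomized comparison does not.

```python
# Hypothetical simulation of the selection problem: "health-conscious"
# people both choose eggs more often AND have better outcomes anyway,
# so the observational comparison is biased even though the true
# effect of eggs here is zero.
import random

random.seed(0)

def simulate(randomized: bool, n: int = 100_000) -> float:
    """Return the eggs-vs-no-eggs difference in average outcomes."""
    treated, control = [], []
    for _ in range(n):
        health_conscious = random.random() < 0.5
        if randomized:
            eats_eggs = random.random() < 0.5              # coin-flip assignment
        else:
            # health-conscious people choose eggs far more often
            eats_eggs = random.random() < (0.8 if health_conscious else 0.2)
        # outcome driven entirely by health-consciousness, not eggs
        outcome = 1.0 * health_conscious + random.gauss(0, 1)
        (treated if eats_eggs else control).append(outcome)
    return sum(treated) / len(treated) - sum(control) / len(control)

print(f"observational estimate: {simulate(randomized=False):+.3f}")  # ~ +0.6, pure bias
print(f"randomized estimate:    {simulate(randomized=True):+.3f}")   # ~ 0
```

Randomization breaks the link between the choice and everything else about the person, which is exactly what the observational comparison cannot do.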
0:02:27 – And we’re using that kind of approach
0:02:31 because of the inability to have long longitudinal studies,
0:02:33 or does it tend to be a shortcut?
0:02:35 – So I think it is both things.
0:02:38 So it is much easier, faster to write papers,
0:02:40 to produce research about that,
0:02:42 and it can be really useful for developing hypotheses.
0:02:43 – So it’s like a scratch pad almost.
0:02:46 – In the best case scenario, it’d be like a scratch pad.
0:02:48 Like let’s just look in the data
0:02:50 and see what kinds of things are associated
0:02:52 with good health or associated with good outcomes for kids.
0:02:54 And then we could imagine a next step
0:02:58 where you would do a more rigorous, gold-standard analysis.
0:02:59 And sometimes that happens.
0:03:01 So there’s one really nice example of the book
0:03:03 where this happens exactly like you would hope,
0:03:06 which is in studying the impact of peanut exposure
0:03:07 on peanut allergies.
0:03:10 So the first paper on that is written by a guy,
0:03:12 and what he did was he just compared Jewish kids
0:03:16 in the U.K. to Jewish kids in Israel,
0:03:18 and he saw that the kids in Israel
0:03:19 were less likely to be allergic to peanuts,
0:03:22 and he said that’s because they eat this peanut snack
0:03:24 called Bamba when they’re babies.
0:03:28 And so then that’s like the hypothesis generation,
0:03:30 and then he went and did the thing you would really like,
0:03:32 would just say, okay, let’s run a randomized trial,
0:03:34 and let’s randomly give some kids early peanuts,
0:03:35 and some kids not.
0:03:37 And indeed, like he found that he was right.
0:03:39 So that’s like a great example
0:03:42 of like how you would hope that literature would evolve.
0:03:44 But in many of the kinds of health settings
0:03:47 we’re interested in, that you can’t do that,
0:03:49 or it is much harder to do that,
0:03:52 because the outcomes would take a long time to realize,
0:03:54 or it’s expensive, or it’s hard to manipulate
0:03:55 what people are doing.
0:03:57 And then we often end up relying
0:04:00 on these more biased sources of data
0:04:02 to draw our conclusions, not just as a scratch pad.
0:04:06 And I think that’s where we encounter problems.
0:04:08 – That’s where it gets murky,
0:04:10 and we never know whether we should eat eggs or not.
0:04:13 Yeah, and that’s exactly the area that you tend to focus on.
0:04:14 – Yeah, exactly.
0:04:16 I try to first see are there good pieces of data
0:04:19 that we can use, and then if we’re stuck with the data
0:04:20 that isn’t good, trying to figure out
0:04:25 which of the murky studies are better than others.
0:04:27 – And what would you mean by better?
0:04:30 – Well, it’s roughly like how good is this study
0:04:34 at controlling or adjusting for the differences across people?
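"Controlling or adjusting for the differences across people" can be illustrated with a toy stratified comparison. The data, strata, and helper functions below are all made up: the naive pooled comparison shows a large egg "effect" that disappears once you compare within levels of an observed confounder.

```python
# Toy stratified ("adjusted") comparison. Each row is a made-up cell:
# (stratum of an observed confounder, eats eggs?, mean outcome, count).
data = [
    ("high_ed", True,  0.9, 800), ("high_ed", False, 0.9, 200),
    ("low_ed",  True,  0.1, 200), ("low_ed",  False, 0.1, 800),
]

def naive_difference(rows):
    """Pool everyone: eggs group minus no-eggs group."""
    t_sum = t_n = c_sum = c_n = 0
    for _, eggs, mean, n in rows:
        if eggs:
            t_sum += mean * n; t_n += n
        else:
            c_sum += mean * n; c_n += n
    return t_sum / t_n - c_sum / c_n

def stratified_difference(rows):
    """Compare eggs vs. no-eggs WITHIN each stratum, then average."""
    strata = {}
    for stratum, eggs, mean, n in rows:
        strata.setdefault(stratum, {})[eggs] = (mean, n)
    total = sum(n for *_, n in rows)
    diff = 0.0
    for arms in strata.values():
        size = arms[True][1] + arms[False][1]
        diff += (arms[True][0] - arms[False][0]) * size / total
    return diff

print(naive_difference(data))       # ~0.48: looks like a big egg "effect"
print(stratified_difference(data))  # 0.0: the "effect" was all education
```

The catch, as the conversation goes on to note, is that you can only adjust for differences you actually observe.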
0:04:38 – So you talk about kind of breaking it down
0:04:42 into both into the relationship between data and preference.
0:04:44 How do you factor in that in the healthcare system
0:04:46 where it’s so diverse, where preference
0:04:48 has such an incredible effect
0:04:51 and puts you into so many different possibilities?
0:04:53 – I think this is why in these spaces,
0:04:55 decision-making should be so personal.
0:04:59 We often run up in health and also in parenting
0:05:01 and all of these spaces into a place
0:05:04 where we’re telling people like
0:05:06 there’s a right thing to do.
0:05:10 And I think that that can be problematic
0:05:12 because it doesn’t recognize this difference
0:05:14 in preferences across people.
0:05:16 – You have to basically accept the variety
0:05:17 in the system and then give a space
0:05:18 for preference in the decision-making.
0:05:20 – Yeah, but I think it’s exactly these preferences
0:05:22 that of course make it hard to learn
0:05:24 about these relationships in the data.
0:05:26 ‘Cause once you recognize that a lot of the reason
0:05:28 that some people choose to eat eggs
0:05:28 and some people choose to eat Cocoa Krispies
0:05:31 is that some people really like Cocoa Krispies
0:05:32 and some people really like eggs.
0:05:34 How can you ever learn about the impact of eggs,
0:05:37 when we know there must be differences across people?
0:05:39 And I think that that becomes even more extreme
0:05:41 when we think about really important decisions
0:05:42 that people are making,
0:05:44 like the kinds of choices they make in parenting
0:05:45 or also in their diets.
0:05:48 – So can you walk us through one example like that
0:05:51 of where it was a really kind of murky gray area
0:05:53 and how you pull out the causality?
0:05:54 – The best example of this in the data
0:05:57 in the parenting space is probably in breastfeeding.
0:05:59 Let’s say you wanna know the impact of breastfeeding
0:06:00 on obesity in kids.
0:06:02 That’s a thing which you hear a lot.
0:06:06 Breastfeeding is a way to make your kid skinny and so on.
0:06:09 And so the basic way you might analyze that
0:06:12 is to compare kids who are breastfed to kids
0:06:13 who are not and look at their obesity
0:06:14 when they’re say seven or eight.
0:06:16 And indeed, if you do that,
0:06:17 you will find that the kids who are breastfed
0:06:20 are less likely to be obese than the kids who are not.
0:06:24 But you will also find that there are all kinds of relationships
0:06:27 between obesity and mother’s income
0:06:28 and mother’s education
0:06:30 and other things about the family.
0:06:32 And those things correlate with breastfeeding
0:06:34 and they also correlate with obesity.
0:06:36 – So you can’t really pull apart this web.
0:06:38 – So it’s hard to pull apart the web.
0:06:39 So I would say this is an example
0:06:42 where the data is suggestive.
0:06:43 It would certainly be consistent
0:06:45 with an effect of breastfeeding on obesity,
0:06:48 but I think it doesn’t prove an effect.
0:06:50 And then you can sort of take the next step
0:06:52 and say, okay, well, do we have any data that’s better?
0:06:53 And in that example,
0:06:55 we do have one kind of randomized data.
0:06:57 But again, we run up against the limits
0:06:59 of all kinds of evidence.
0:07:02 So the randomized data on this question
0:07:05 is from a randomized trial that was run in Belarus
0:07:07 in the 1990s.
0:07:09 They randomly encouraged some moms to breastfeed
0:07:09 and some moms not.
0:07:11 And so there’s a lot of good things
0:07:12 that we can learn from that.
0:07:14 – But such a specific place in time.
0:07:15 – Exactly, it’s so specific.
0:07:16 And you said like, well, you know,
0:07:21 how do I take that result to the Bay Area in 2019?
0:07:23 It’s a challenge.
0:07:24 Okay, well, is there anything else within this space
0:07:28 of non-randomized data that’s better?
0:07:29 And in that case, there is,
0:07:32 there are some studies that like compare siblings,
0:07:35 where you look at two kids
0:07:36 born to the same mom.
0:07:39 One of whom was breastfed and one of whom was not.
0:07:40 And then look at their obesity rates.
0:07:41 And when you do that,
0:07:43 you find there’s basically no impact.
0:07:46 So then you’re kind of holding constant like who’s the mom.
0:07:49 So if your worry was that there are differences
0:07:51 across parents in their choices to breastfeed,
0:07:53 well now you’re looking at the same parent.
0:07:54 – Right, you’re normalizing.
0:07:55 – You’re normalizing.
0:07:57 And so you may think, oh, that’s great, perfect.
0:07:57 I’m totally done.
0:07:59 But of course you’re not,
0:08:02 this isn’t perfect because why did the mom choose
0:08:03 to breastfeed one kid and not the other?
0:08:04 – People are not choosing at random.
0:08:05 – You had a C-section one time,
0:08:07 you didn’t another time.
0:08:08 – If that were the reason that would be great, right?
0:08:11 If the reason were just like kind of worked one time,
0:08:12 didn’t work the other time.
0:08:14 If there was something that was effectively
0:08:16 a little bit random,
0:08:19 then that would be exactly the kind of variation
0:08:21 you’d wanna use.
0:08:23 But the thing you worry about is like one kid
0:08:25 is not doing well, is unhealthy.
0:08:26 So the mom chooses not to breastfeed
0:08:30 or chooses to breastfeed to try to make them healthier.
0:08:32 Those are the kind of things where there’s some other reason
0:08:34 that they’re choosing differences in breastfeeding
0:08:36 which has its own effect on the kid’s outcomes.
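The sibling-comparison idea, holding the family constant by differencing two kids born to the same mom, can be sketched like this. All numbers are invented; the true breastfeeding effect is set to zero, so the across-family comparison is biased by shared household factors while the within-family one is not. (As noted above, this only works if which sibling was breastfed is effectively random.)

```python
# Sketch of a within-family ("sibling") comparison, with invented data.
# `family` stands for everything siblings share (mom, income, household);
# higher-`family` households breastfeed more, creating across-family bias.
# The true breastfeeding effect on the outcome is zero.
import random

random.seed(1)

naive_bf, naive_not = [], []
within_diffs = []
for _ in range(50_000):
    family = random.gauss(0, 1)
    p_breastfeed = 0.8 if family > 0 else 0.2     # family-level selection
    kids = []
    for _ in range(2):
        bf = random.random() < p_breastfeed
        outcome = family + random.gauss(0, 1)     # no breastfeeding term at all
        kids.append((bf, outcome))
        (naive_bf if bf else naive_not).append(outcome)
    k1, k2 = kids
    if k1[0] != k2[0]:                            # "discordant" sibling pairs only
        bf_out = k1[1] if k1[0] else k2[1]
        not_out = k2[1] if k1[0] else k1[1]
        within_diffs.append(bf_out - not_out)

naive = sum(naive_bf) / len(naive_bf) - sum(naive_not) / len(naive_not)
within = sum(within_diffs) / len(within_diffs)
print(f"across families: {naive:+.2f}")   # ~ +0.96: all selection, no effect
print(f"within families: {within:+.2f}")  # ~ 0
```

Differencing siblings cancels anything constant within the family; it cannot cancel reasons that differ between the two kids.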
0:08:41 So, some of what I try to do in the book
0:08:43 is sort of like put all of these pieces together
0:08:46 and kind of like look at them
0:08:50 and think about them all as a sort of totality of evidence
0:08:52 and just think like how compelling is this altogether?
0:08:55 – It sounds almost like sifting, like using a sifter.
0:08:58 You take all this very murky data,
0:09:00 very variable from all sorts of different contexts
0:09:02 and like put it through the sifter of like
0:09:03 this kind of data, this kind of data
0:09:06 and then match it all up and say, okay, what do we have left?
0:09:08 And then therefore, and then hand that over and say,
0:09:11 and now you make the decision based on this.
0:09:12 – Based on this.
0:09:14 – Right, here’s kind of what we can be more
0:09:15 or less sure about.
0:09:17 – You talk a little bit about the idea
0:09:20 of constrained optimization as being very important.
0:09:23 Can you explain what that means and how that plays out?
0:09:25 – In economics, we think about people
0:09:27 optimizing their utility function.
0:09:28 The idea is that you have a bunch of things
0:09:30 that make you happy, that’s your utility.
0:09:33 They produce your utility and you want to make the choices
0:09:35 that are going to optimize your utility.
0:09:39 They’re going to give you the most amount of happiness points,
0:09:40 utils.
0:09:44 It’s really, it’s a very warm and fuzzy term.
0:09:46 – Yeah, I feel like I’m gonna go home and use that.
0:09:47 – Absolutely.
0:09:49 – Like you gave me some utils today.
0:09:53 – But we also recognize that people have constraints.
0:09:54 In the absence of constraints,
0:09:58 like having money to buy things or time to do them,
0:10:00 people would just have an infinite amount of stuff.
0:10:02 That’s the thing that would make them the most happy.
0:10:04 And so, but when you’re actually making choices,
0:10:07 you’re constrained by either money or time.
0:10:09 And in the book, I talk a lot about this
0:10:12 in the context of time, that you’re as a parent,
0:10:15 you’re making choices, and you have some preferences
0:10:16 and things you would like to do,
0:10:19 but you are also facing some constraints.
0:10:22 – But is information flow,
0:10:25 and the data itself, a constraint in that regard?
0:10:27 Because it’s so piecemeal,
0:10:29 the information you get.
0:10:30 That feels almost totally random.
0:10:32 Like some media story picks up on something,
0:10:34 you tend, you know, some tidbit, you hear some,
0:10:36 unless you’re like systemically
0:10:38 studying a graduate seminar on parenting,
0:10:41 which none of us do, you know, then it is random.
0:10:44 – Yeah, and I think we wouldn’t necessarily think of that
0:10:46 as in constraints, because of course in our models,
0:10:49 people are fully informed about everything all the time.
0:10:52 That’s one of the great things about the models.
0:10:52 – But in real life?
0:10:55 – But in real life, yeah, I think people face constraints
0:10:58 associated with just not having all the information.
0:11:01 And, you know, also the fact that this
0:11:04 kind of information, like, whipsaws over time,
0:11:07 that you know, you get one piece and then you kind of,
0:11:10 the next day there’s a different piece of information
0:11:12 and we have a tendency to kind of
0:11:15 glom onto whatever is the most recent thing
0:11:16 that we have seen about this,
0:11:19 as opposed to what is the whole literature
0:11:21 over this whole period of time say.
0:11:24 – Right, you say, you have a great quote where you say,
0:11:26 in confronting the questions here,
0:11:28 we also have to confront the limits of the data
0:11:29 and the limits of all data.
0:11:30 There’s no perfect studies,
0:11:33 so there will always be some uncertainty about conclusions.
0:11:35 The only data we have will be problematic.
0:11:37 There will be a single not very good study
0:11:39 and all we can say is that this study
0:11:41 doesn’t support a relationship.
0:11:44 So it feels kind of hopeless.
0:11:46 I loved when you talked about the first three days
0:11:47 of when you brought Penelope home
0:11:50 and it really brought that back for me,
0:11:52 that it was just this dark room
0:11:54 where you’re kind of alone making these decisions.
0:11:57 How do you even begin to see this data,
0:12:00 you know, as a decision making practice?
0:12:02 Like how does that translate?
0:12:04 – There are pieces where it’s easier,
0:12:08 where the data is better and it is clearer
0:12:10 about what you need to do or what the choices are.
0:12:12 You will be making many choices
0:12:17 without the benefit of evidence or data or very good data.
0:12:19 I think part of what makes some of this parenting so hard
0:12:21 is that for those of us who like, you know,
0:12:24 evidence and facts, it’s hard to accept,
0:12:26 I’m just going to have to make this decision
0:12:29 basically based on what I think is a good idea.
0:12:30 – Based on my gut.
0:12:31 – Based on my gut.
0:12:36 – And, you know, maybe based on my mom and, you know.
0:12:37 – Which is a sample size of one.
0:12:39 – A sample size of one.
0:12:41 And, you know, maybe if you have like a mother-in-law
0:12:43 and father-in-law, it’s like a sample size of two,
0:12:46 but that’s kind of, that’s kind of it.
0:12:48 And I think that that’s really scary,
0:12:50 especially when the choices seem so important.
0:12:52 – Yeah, I mean, but it feels like, you know,
0:12:54 that’s kind of at heart what you’re trying to do, right?
0:12:56 Is like to translate and to give tools
0:12:58 in this decision making place.
0:13:02 So how would you begin to systematize that?
0:13:06 I mean, is there a way to bridge that gap better
0:13:07 in the system?
0:13:09 – I think that it would be helpful
0:13:12 if more information was shared.
0:13:14 So I think a lot of these things,
0:13:17 there is a lot of information that is contained
0:13:21 in people’s experiences that we are not using
0:13:23 in our evidence production.
0:13:27 So in the book I talk about like the sleep schedule, right?
0:13:30 So you’re sort of told as a parent, like, oh, you know,
0:13:32 this is kind of roughly like around six weeks,
0:13:35 your kid will start sleeping like longer at night,
0:13:38 but the information that’s sort of
0:13:42 typically conveyed to people is not a range.
0:13:44 It’s just like around six weeks-ish,
0:13:46 you know, that’ll start to happen.
0:13:48 But the truth is like, yeah, that’s kind of right,
0:13:51 but it’s, if you look at data on when that actually happens,
0:13:54 it’s pretty, it’s a pretty wide range.
0:13:57 And I think part of what is so stressful
0:14:00 about this early, these like early parts of parenting
0:14:03 are that it’s very hard to understand
0:14:06 whether what you’re experiencing is like normal.
0:14:08 And I think if you could understand, like, yeah,
0:14:11 most kids don’t do this thing at this time
0:14:13 or most parents have this experience or-
0:14:15 – The way the graph plots kind of.
0:14:16 – Yeah. – A little more broadly.
0:14:17 – Exactly.
0:14:19 What is, I think that would be, that would be super helpful.
0:14:21 And that’s a place where I can imagine,
0:14:23 you know, data collection helping, right?
0:14:27 You know, we have a much more of an ability at this point
0:14:30 to like get information about what is happening
0:14:33 with our kids, what’s happening with, you know, with our health.
0:14:35 There is a sense in which that could be helpful
0:14:39 in just setting some norms for the normal,
0:14:42 the standard variation across people.
0:14:45 – So looking at the variation and providing that
0:14:46 as like a piece of the information.
0:14:47 – As a piece of the information.
0:14:48 – Here’s also the variation on that.
0:14:49 – Yeah. – Yeah.
0:14:51 – Yeah, and I think that is kind of part of like
0:14:53 conveying the uncertainty and sort of showing people
0:14:55 like what are the limits of the data, right?
0:14:58 That how sure are you that this should happen at this time?
0:15:00 – Not just how sure, but like what are some of the
0:15:02 like other ends of the curve?
0:15:04 I mean, that’s just information you just don’t get.
0:15:05 – Yeah, you just don’t get, right.
0:15:07 – So let’s zoom out a little bit
0:15:09 as somebody who lives in the world deeply of data
0:15:11 in the health system.
0:15:14 We’re in a time of enormous shift, right, for data.
0:15:17 Does the improvement, does our kind of the sea of data
0:15:20 and like better data, cleaner data, more granular data,
0:15:23 all that help this at all, this question?
0:15:28 – Yeah, and I think we are collecting so much data
0:15:32 on people, both sort of individual people
0:15:34 are collecting a lot of data about themselves.
0:15:37 Health systems are collecting a lot of data about people.
0:15:40 This data is like underutilized, I think.
0:15:41 We’re amassing pools of it,
0:15:43 but not in ways that are especially helpful.
0:15:45 So, you know, when I go to conferences
0:15:47 and people who work on healthcare,
0:15:48 like there’s a tremendous amount of data
0:15:50 that’s being used on health claims, right?
0:15:51 So if you sort of think about like,
0:15:53 what are some kinds of data that we have?
0:15:55 We have like health claims data, like payments,
0:15:56 everything that we’re,
0:15:58 where there’s an individual payment for it,
0:16:00 we’ll like see, we’ll see it.
0:16:02 There’s almost no work with medical records.
0:16:05 Even though every hospital, everybody’s using Epic,
0:16:08 you would think that that would make it straightforward
0:16:11 to have that data in a usable form, but it’s not.
0:16:12 And, but you know, at the same time,
0:16:16 the potential for sort of going beyond like,
0:16:18 here is all the tests that you ordered
0:16:21 into actually like, what happened with those tests?
0:16:23 And then what happened to this person later?
0:16:27 Like that data is not being mined in the way
0:16:30 that we could to try to look at some, you know,
0:16:33 at some of the kinds of outcomes that are a result.
0:16:36 – The causality that you would pull out afterwards.
0:16:37 – Yeah, absolutely.
0:16:39 You know, how can we improve our causality?
0:16:40 More data is helpful.
0:16:42 More information about people is helpful.
0:16:43 Being able to look at, you know,
0:16:45 the timing relationship between some treatment
0:16:47 and some outcome, those are all the kinds of things
0:16:50 that, you know, having better data would help us,
0:16:51 would help us do.
0:16:53 – Are there other areas where you start,
0:16:56 you are starting to see the data coalesce in a way
0:16:59 where you’re able to pull meaningful insights from it?
0:17:01 – So I think, yes, you know, when we have better data,
0:17:06 we can use better tools, even if we don’t have randomization.
0:17:09 A classic example in health is looking at the impacts
0:17:11 of like a really advanced neonatal care.
0:17:14 Like how cost effective is it to have like,
0:17:16 you know, kids in sort of getting like,
0:17:17 really extensive NICU care?
0:17:21 Like how effective is that in terms of improving survival
0:17:22 and how much does it cost?
0:17:23 – No, such a basic question.
0:17:27 – Such a basic question, and super hard to imagine analyzing
0:17:29 because of course, you know, babies that are very small
0:17:32 and are sick cost more but also have worse outcomes.
0:17:34 And so if you sort of looked at that,
0:17:36 you would be like, well, actually like spending more,
0:17:38 we’re not getting anything because those babies
0:17:41 are more likely to die than babies that are spending less.
0:17:46 We define very low-birthweight babies as less than 1500 grams,
0:17:48 which means that the treatment that you get
0:17:51 if you’re a baby at 1503 grams
0:17:53 is very different than the treatment that you get
0:17:56 as a baby at 1497 grams, which is completely arbitrary.
0:17:59 I mean, the choice of 1500 grams has nothing to do with science.
0:18:00 – It’s like this line in the sand.
0:18:02 – That’s not a good way to set policy.
0:18:04 However, having set the policy like that,
0:18:07 you can then say, okay, well now we have some babies
0:18:08 that are almost exactly the same,
0:18:11 but the babies that are a little bit lighter
0:18:12 that are like 1497 grams
0:18:14 get all kinds of additional interventions
0:18:17 relative to the babies that are 1503 grams.
0:18:18 And when people have done that,
0:18:21 they see actually the babies at 1497 grams do better.
0:18:24 – So the line actually is beneficial in that way
0:18:26 because you’re defining these two groups very closely.
0:18:28 – Oh, interesting.
0:18:29 – Setting this line in this arbitrary way
0:18:31 lets you get at some causality.
0:18:32 – Even though not good for the babies.
0:18:34 – Sort of having done it good for research.
0:18:35 – Good for information.
0:18:36 Interesting.
0:18:38 What are some of the other tools?
0:18:39 Are there others in that list?
0:18:44 – So that’s an example of what’s called regression discontinuity,
0:18:47 that there’s some discontinuous change in policy
0:18:49 on either side of a cutoff.
0:18:52 And that has become a sort of part of a big toolkit
0:18:55 of things people are using more of.
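The birthweight example is an instance of regression discontinuity, and a toy version is easy to simulate. The cutoff matches the 1500-gram rule discussed above, but the "health score," slope, and treatment effect below are all hypothetical: comparing narrow windows on either side of the cutoff approximately recovers the effect of the extra interventions.

```python
# Toy regression discontinuity at an arbitrary 1500-gram cutoff.
# The health score, slope, and +2.0 treatment effect are all made up;
# babies just under the cutoff get the extra interventions.
import random

random.seed(2)

CUTOFF = 1500.0
TRUE_EFFECT = 2.0                                  # hypothetical benefit

babies = []
for _ in range(200_000):
    weight = random.uniform(1300, 1700)
    treated = weight < CUTOFF                      # the policy rule
    # health varies smoothly with weight, plus a jump from treatment
    health = 0.005 * weight + TRUE_EFFECT * treated + random.gauss(0, 1)
    babies.append((weight, health))

def mean_health(lo, hi):
    scores = [h for w, h in babies if lo <= w < hi]
    return sum(scores) / len(scores)

# Babies within 10 g of the cutoff are nearly identical except for treatment,
# so the smooth part of the outcome approximately cancels.
WINDOW = 10
estimate = (mean_health(CUTOFF - WINDOW, CUTOFF)
            - mean_health(CUTOFF, CUTOFF + WINDOW))
print(f"RD estimate: {estimate:+.2f}")             # ~ +2
```

The arbitrariness of the cutoff is what makes the comparison credible: babies at 1497 and 1503 grams are essentially the same except for which side of the rule they landed on.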
0:18:57 The other is to look at sort of sharp changes
0:19:01 in policies at a time, at like a moment in time.
0:19:03 – Oh, so the same thing at the same time.
0:19:05 – Then there’s that and then there’s looking across
0:19:07 when different policies change differently
0:19:08 for different groups.
0:19:13 So, all of these things have become easier with more data
0:19:15 and become more possible with more data.
0:19:16 And I think that that has improved our inference
0:19:19 in some of these settings.
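The last tool mentioned, comparing groups whose policies change differently over time, is the difference-in-differences idea. A minimal sketch with made-up numbers: the adopting group's change minus the non-adopting group's change nets out the trend the two groups share.

```python
# Minimal difference-in-differences with made-up averages:
# one group adopts a policy between period 0 and period 1, one does not,
# and both share an underlying trend of +1.
outcomes = {
    ("adopter", 0): 10.0,
    ("adopter", 1): 14.0,       # shared trend (+1) plus policy effect (+3)
    ("non_adopter", 0): 8.0,
    ("non_adopter", 1): 9.0,    # shared trend only
}

change_adopter = outcomes[("adopter", 1)] - outcomes[("adopter", 0)]          # 4.0
change_control = outcomes[("non_adopter", 1)] - outcomes[("non_adopter", 0)]  # 1.0
did_estimate = change_adopter - change_control
print(did_estimate)  # 3.0: policy effect, net of the shared trend
```

The key assumption, hidden in the arithmetic, is that the two groups would have trended in parallel absent the policy.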
0:19:21 – I love that you talked a little bit about the experience
0:19:24 of doing your own data collection kind of in the wild
0:19:27 with this spreadsheet after Penelope was born,
0:19:27 which made me laugh so much
0:19:30 ’cause it was so much like my spreadsheet.
0:19:33 It was just so sad to think of like all these moms alone
0:19:35 in their bedrooms at night.
0:19:35 – I know, I know.
0:19:40 I mean, I think there have been a lot more apps
0:19:44 since then that help, yeah.
0:19:48 But still, I love that you said
0:19:52 it gives the illusion of control, not control.
0:19:55 And in that particular, in these kinds of like data vacuums,
0:19:58 like if we’re not good at statistical analysis
0:20:01 or like pulling out causality from these murky areas,
0:20:03 like if we’re not Emily Oster basically,
0:20:05 how do you like, or even if you are,
0:20:08 how do you kind of stay on that line
0:20:10 of like the illusion of control
0:20:11 versus like actual knowledge
0:20:14 that like impacts real decision-making?
0:20:15 – No, I think it’s super hard
0:20:17 because the thing is the illusion of control
0:20:19 is a very powerful illusion.
0:20:20 – Very, yeah.
0:20:24 – And both empowering and dangerous in health context.
0:20:25 – Exactly.
0:20:26 Like we would, you know, you would sort of,
0:20:29 we like people to feel like they’re in control.
0:20:30 Some of the message of this book,
0:20:31 I think people have taken, not quite right,
0:20:33 but to say like, well, it doesn’t really matter
0:20:36 what choices you make, like all choices are good choices.
0:20:38 I think there’s, that’s not quite the right,
0:20:39 it’s not quite the message.
0:20:41 – That’s, I’m surprised that’s the message
0:20:42 that people take from this.
0:20:43 – Occasionally.
0:20:45 – There are a lot of different good choices
0:20:47 that you could make about parenting.
0:20:50 And so I think that there is a piece that like,
0:20:54 we maybe don’t need to be so like obsessive
0:20:55 about all of these.
0:20:55 – About one of those.
0:20:57 – About one of those, any one of those,
0:20:58 of those choices.
0:20:59 – What’s your point about range?
0:21:02 It’s like, well, let’s educate a little bit more
0:21:03 about like the spectrum of possibilities.
0:21:05 – Spectrum of good, of good choices.
0:21:06 – Yeah.
0:21:09 Another area I feel like where every other day
0:21:11 there’s a new study that says something different.
0:21:14 And it feels like there’s a plethora of studies
0:21:15 is screen time.
0:21:17 I’m just, I’m gonna put that out there right now.
0:21:18 I’m sorry.
0:21:21 Everybody, we’re gonna touch that third rail.
0:21:22 – Three times.
0:21:24 – So can you walk us through,
0:21:28 like can you help guide us through some of that maze?
0:21:29 – So when I looked into screen time,
0:21:31 I had always thought about like screen time
0:21:32 is like bad.
0:21:35 Like it’s like, the question is, is it bad or not?
0:21:37 But actually there’s like a whole other side of this,
0:21:38 which is some people like screen time
0:21:40 is the way to make your kid smart.
0:21:42 Like you can, like your baby can learn from that.
0:21:42 – Okay.
0:21:44 So point number one is what does screen time
0:21:45 actually mean?
0:21:45 – Right.
0:21:46 – Which is a bunch of different stuff.
0:21:47 – Yeah.
0:21:48 And I think that’s part of the,
0:21:49 like that’s part of the problem with this is like,
0:21:51 when you say screen time, like what do you mean?
0:21:52 – Yeah.
0:21:54 – Do you mean like, you know, educational apps?
0:21:55 – Yeah.
0:21:56 – Do you mean…
0:21:57 – Do you mean Sesame Street?
0:21:57 – Sesame Street?
0:21:58 – Where you like jump in the shower?
0:21:59 – Yeah.
0:22:01 – Or yeah, and that point like while you jump in the shower,
0:22:03 like what is the other thing you’re going to be doing
0:22:04 with your time?
0:22:07 I think this is where all of these recommendations
0:22:11 seem to assume that the alternative use of your,
0:22:13 like if your kid wasn’t watching Sesame Street,
0:22:15 you would be like on the floor,
0:22:16 like playing puzzles with them
0:22:18 and like super engaged with them.
0:22:19 Which like, maybe is true.
0:22:21 – Taking them to the zoo and like having them touch
0:22:22 different textures of animal skins
0:22:25 or whatever like sensory development, yeah.
0:22:26 – Yeah, which like is great stuff
0:22:28 that you should definitely do with your kid.
0:22:30 But some of the time when, you know,
0:22:32 when our kids are watching TV,
0:22:34 it’s cause like we, it’s,
0:22:35 that maybe isn’t the thing
0:22:36 that you would otherwise be doing.
0:22:37 – Yeah.
0:22:39 – You could be like puréeing healthy vegetables
0:22:40 to like feed them well.
0:22:42 – Yeah, exactly.
0:22:43 I’m sure that’s what we’re all doing.
0:22:47 – Or maybe watching a little reality TV
0:22:49 for five minutes while you fold laundry.
0:22:51 – You watch a little bit of Call the Midwife,
0:22:53 you know, got a little bit of, yeah.
0:22:54 – The problem with screen time is that the evidence
0:22:56 is very, is very poor.
0:22:58 – Can you just break up like why the evidence
0:22:59 is so poor?
0:23:00 Because this does seem like an area
0:23:02 where there should have been time
0:23:04 for that kind of gold standard randomized study
0:23:05 that to develop.
0:23:07 No, what is the evidence problem?
0:23:09 – So the, I think the evidence problem is twofold.
0:23:11 One, it’s actually not a super easy thing
0:23:14 to run a randomized trial on
0:23:15 because these are choices
0:23:17 that people are thinking a lot about.
0:23:19 And, you know, think about something like an iPad.
0:23:20 Like, do you want to be involved
0:23:23 in a randomized trial of whether your kid-
0:23:24 – Oh, there’s too much intention,
0:23:25 too much at stake.
0:23:26 – Too much attention, exactly.
0:23:28 – Too much like lifestyle stuff.
0:23:30 Some people have been able to use like the introduction
0:23:33 of TV, which was sort of had some random features
0:23:34 to like look at the impacts of TV.
0:23:36 And that evidence is sort of reassuring
0:23:37 and suggested TV is okay.
0:23:38 But of course it’s very old.
0:23:40 It’s like from the fifties.
0:23:42 – A whole different way of consuming everything.
0:23:43 – Yeah.
0:23:44 – And I think the other thing is,
0:23:46 the other problem with the sort of current,
0:23:48 answering the current questions people want,
0:23:50 like what about iPads, what about apps, you know,
0:23:52 is that they just haven’t been around long enough.
0:23:54 So a lot of the kinds of outcomes you would want to know about,
0:23:58 even short-run things like test scores —
0:24:02 you know, I got the first iPad when my daughter was born.
0:24:03 Like that was version one,
0:24:04 and I remember getting it and being like,
0:24:06 this is never going to catch on.
0:24:09 This is why I’m not in tech.
0:24:10 I was like, who would use this?
0:24:12 – Meanwhile, your daughter’s like swiping.
0:24:14 – Meanwhile, you’re just like, okay.
0:24:17 But you know, now she’s in second grade.
0:24:20 Like that’s kind of the earliest that you could kind of
0:24:22 imagine getting some kind of,
0:24:24 what we’d measured test scores or something like that.
0:24:26 But even, you know, she didn’t use the iPad anywhere
0:24:30 near as facile a way as my four-year-old, right?
0:24:32 This is evolving so quickly
0:24:35 that any kind of even slightly longer term outcomes
0:24:37 are really hard to imagine measuring,
0:24:40 let alone, you know, absent a randomized trial,
0:24:42 the, like if you weren’t able to randomize this,
0:24:44 which I think you won’t be able to,
0:24:48 the amount of time kids spend on these screens
0:24:52 is really wrapped up with other features of their household.
0:24:54 – Yeah, okay, so you have the definitions.
0:24:59 You have the time and the speed at which things are changing.
0:25:01 And then you have the willingness for people
0:25:03 to actually like engage and change
0:25:05 or doing things differently.
0:25:07 And so, with all of that,
0:25:10 what do the studies actually tend to look like
0:25:12 in this space that we draw conclusions from?
0:25:14 – So actually there’s almost nothing
0:25:16 about iPads or phones.
0:25:18 – That seems so contrary to like
0:25:20 what the media is saying every five minutes.
0:25:22 – Yeah, so there’s tons of studies on TV,
0:25:24 which compare kids who watch more and less TV.
0:25:26 And, you know, most of that, again,
0:25:28 is sort of studies that are based on data
0:25:31 from before people were watching TV on these screens.
0:25:35 Maybe TV is TV, and you know, you can imagine
0:25:36 that that would be kind of similar.
0:25:39 But things like these apps, there are just no studies,
0:25:41 you know, or there’ll be, there’s like,
0:25:45 I think there’s one abstract from a conference.
0:25:47 This is not a paper. It was just like,
0:25:49 we have some kids
0:25:51 and we compare the kids who spend,
0:25:52 like the babies who spend more time
0:25:54 watching their parents’ phones.
0:25:57 And then they like do worse, they’re like, look worse.
0:25:59 – But it’s like, it’s pathetic, it’s sad.
0:26:01 – It’s a terrible piece of evidence.
0:26:04 – So is this an area in which you just go with your gut?
0:26:07 – I mean, I try to generate a fancy version of go
0:26:10 with your gut, which is called Bayesian updating.
0:26:13 And so I basically try to say, look, you know,
0:26:16 I mean, we want to step back and think about
0:26:18 what are the places of uncertainty?
0:26:21 Logic would tell you, you know, your kid is awake
0:26:24 for, what, like 13 hours a day, 12 hours a day.
0:26:27 If your two year old is spending seven of those 12 hours
0:26:30 playing on the iPad, then there’s a lot of things
0:26:31 that they are not doing.
0:26:32 That’s probably not good.
0:26:36 On the other hand, you know, if your kid is spending
0:26:40 20 minutes every three days, it’s very hard to imagine
0:26:41 how that could be bad.
0:26:44 – So just thinking about it purely in terms of, like,
0:26:46 time allotted to any one activity, basically.
0:26:48 And then I think once you do that, then you’re sort of like,
0:26:50 okay, but you know, there are things that we’re uncertain
0:26:52 about, you know, what if my kid watches an hour of TV
0:26:54 every day or spends an hour on a screen every day?
0:26:56 Like is that too much?
0:26:59 If we sort of accept like five minutes a day is fine
0:27:00 and seven hours a day is too much,
0:27:03 like is the limit at an hour, is the limit at two hours?
0:27:06 You know, and I think the truth is what we will find
0:27:08 if we ever end up doing any studies like this,
0:27:10 is that it depends a lot what other things
0:27:13 they would be doing with their time.
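The “fancy version of go with your gut” that Oster names — Bayesian updating — can be sketched as a toy calculation. All of the probabilities below are invented for illustration; the point is only that weak, confounded evidence should barely move a sensible prior, while a strong randomized result would move it a lot:

```python
# A toy sketch of Bayesian updating about screen-time harm.
# Every number here is a made-up illustration, not from any study.

def update(prior, p_data_if_true, p_data_if_false):
    """Bayes' rule: P(harmful | study found harm)."""
    numerator = prior * p_data_if_true
    denominator = numerator + (1 - prior) * p_data_if_false
    return numerator / denominator

# Prior belief that an hour of screen time a day is harmful.
prior = 0.20

# A weak observational study "finds harm" -- but confounded studies
# find harm often whether or not the effect is real, so the
# likelihood ratio is close to 1 and the posterior barely moves.
posterior_weak = update(prior, p_data_if_true=0.6, p_data_if_false=0.5)

# A well-run randomized trial finding harm would move us much more.
posterior_strong = update(prior, p_data_if_true=0.9, p_data_if_false=0.1)

print(round(posterior_weak, 3))    # 0.231 -- barely above the prior
print(round(posterior_strong, 3))  # 0.692 -- a real shift
```

The design choice is the likelihood ratio: a study that would “find harm” about as often under either hypothesis carries almost no information, which is why poor screen-time evidence shouldn’t override your gut.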
0:27:15 – Wouldn’t it also depend so much on the child?
0:27:19 – Some children, you know, learn in a kind of way
0:27:21 that lends itself to this technology.
0:27:24 Some children need other kinds of learning, you know.
0:27:25 It’s highly individual.
0:27:27 – Yeah, I mean, I think this gets into the problem
0:27:29 with studying older kids in general,
0:27:30 that just like there’s so much,
0:27:31 there’s so many differences across kids.
0:27:33 It’s hard to even think about how you would structure
0:27:34 a study to learn about them.
0:27:37 Nevermind actually, like using evidence that exists.
0:27:40 – It’s really interesting because the last time we went
0:27:43 to take my daughter for her annual checkup,
0:27:44 or maybe it was my son, I can’t even remember.
0:27:46 It’s so different from the first days
0:27:47 of those early spreadsheets, where now I’m like,
0:27:49 did I even write down which one that was?
0:27:51 – Yeah, exactly.
0:27:54 Anyways, the doctor said very concretely,
0:27:57 two hours, two hours max, within any day.
0:27:58 But it was really interesting to me
0:28:01 that it was such a specific line in the sand.
0:28:04 And now I’m thinking about how that information
0:28:06 would even percolate down
0:28:08 to that level of, like, the system
0:28:10 and get kind of fossilized into the system,
0:28:12 so that that recommendation is being passed on to parents.
0:28:14 Like how does that happen with these studies?
0:28:18 How do they translate to that level of advice?
0:28:22 – Yeah, I think what happens is like organizations
0:28:23 like the American Academy of Pediatrics,
0:28:26 they bring people together to basically talk
0:28:28 about the conversation we just had,
0:28:30 which was like, okay, let’s agree,
0:28:31 sort of like we don’t know that much about this,
0:28:34 like five minutes seems fine, seven hours is too much.
0:28:36 These are like smart people who see kids a lot,
0:28:38 who presumably are using some knowledge
0:28:41 that they have about kids to pick some number.
0:28:43 But the answer is like, you could pick
0:28:44 a lot of different numbers.
0:28:47 We sort of say this and then it becomes like this rule.
0:28:49 And people have some impression that it comes
0:28:52 from some piece of evidence as opposed to sort of like,
0:28:58 you know, a synthesis of expert opinion or something,
0:29:01 which is really what it’s from.
0:29:04 – You also work specifically on certain health recommendations.
0:29:08 So how they change over time and how we stick to them.
0:29:10 You wrote a paper on behavioral feedback.
0:29:13 And then you talk about how those individual choices
0:29:15 might in fact be changing the science itself.
0:29:17 Can you talk about what that means
0:29:18 and how that might be happening?
0:29:20 – I was thinking about exactly this issue of like,
0:29:21 okay, we just make some recommendation
0:29:24 and sometimes those recommendations are kind of arbitrary,
0:29:25 but then they go out in the world
0:29:27 and people respond to them. – Take on lives of their own.
0:29:28 – Exactly.
0:29:30 And so like a sort of a good example is
0:29:32 vitamin E supplements.
0:29:35 Like in the early 90s, there were a couple of studies
0:29:37 which suggested that like they are good for your health,
0:29:38 that they, like, prevent cancer.
0:29:40 And so then there was like a recommendation,
0:29:42 like people should take vitamin E.
0:29:45 And then we ask a question like,
0:29:47 who takes vitamin E after that?
0:29:51 And one of the concerns is the kind of people
0:29:53 who would adopt these new recommendations,
0:29:55 like who listens to their doctor.
0:29:59 It is people who are probably,
0:30:01 maybe they’re more educated, maybe they’re richer,
0:30:05 but like above all, they are interested in their health.
0:30:07 So they are taking vitamin E, so they avoid cancer,
0:30:09 but they’re also exercising, so they avoid cancer.
0:30:11 And they’re eating vegetables, so they avoid cancer.
0:30:12 We would call them selected.
0:30:14 These people are like positively selected
0:30:16 on other health things.
0:30:18 And so indeed you can see in the data
0:30:21 that the people who start taking vitamin E
0:30:22 after this recommendation changes
0:30:25 are kind of also exercising and not smoking
0:30:27 and doing all kinds of other stuff.
0:30:30 Well, why is that like interesting or problematic?
0:30:34 Well, later we’re gonna go back to the data
0:30:36 ’cause that’s like the way science works.
0:30:38 But now the people who take vitamin E
0:30:42 are even more different than they were before, right?
0:30:43 So now these people are like —
0:30:44 – So you’ve added another layer.
0:30:45 – You’ve added another layer.
0:30:47 So in fact, you can see that in the data.
0:30:49 You can see that basically before these recommendations
0:30:52 changed, there was sort of a small relationship
0:30:56 between taking vitamin E and like subsequent mortality rates.
0:30:58 But after the recommendations change,
0:31:01 you see like a very large relationship
0:31:03 between vitamin E and mortality rates.
0:31:06 And so it basically ends up looking like vitamin E
0:31:08 is really great for you.
0:31:09 – Has this big impact.
0:31:11 – But of course that’s because at least,
0:31:13 it seems like it must be at least in part
0:31:16 because the people who adopt vitamin E
0:31:21 are the people who are also doing these other things.
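The selection story Oster describes is easy to simulate. In the sketch below the “vitamin E” does nothing at all, but once the recommendation makes health-conscious people more likely to take it, the naive comparison of takers and non-takers makes it look protective. Every number is invented for illustration:

```python
# Toy simulation of positive selection into taking vitamin E.
# The pill has zero true effect; only health-consciousness matters.
import random

random.seed(0)

def mortality_gap(selection_strength, n=100_000):
    """Naive 'benefit' of the pill: death rate of non-takers minus
    death rate of takers, when the true effect is zero."""
    takers_dead = takers = others_dead = others = 0
    for _ in range(n):
        healthy = random.random() < 0.5          # health-conscious?
        # Health-conscious people are more likely to follow the
        # recommendation; selection_strength controls how much more.
        p_take = 0.2 + selection_strength * (0.5 if healthy else 0.0)
        takes = random.random() < p_take
        # Death risk depends only on health behaviors, not the pill.
        dies = random.random() < (0.05 if healthy else 0.15)
        if takes:
            takers += 1
            takers_dead += dies
        else:
            others += 1
            others_dead += dies
    return others_dead / others - takers_dead / takers

before = mortality_gap(selection_strength=0.2)  # mild selection
after = mortality_gap(selection_strength=0.8)   # post-recommendation

# The apparent benefit widens purely because of WHO takes the pill.
print(f"naive benefit before: {before:.3f}, after: {after:.3f}")
```

The gap roughly triples here even though the drug does nothing, which is exactly the “another layer” of selection the conversation points at.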
0:31:23 – So what does that mean then?
0:31:24 It feels like such a loss.
0:31:26 Like how does one ever- – It’s so depressing.
0:31:28 – Yes. (laughs)
0:31:31 How would one ever develop like a recommendation
0:31:33 based on what we think we know?
0:31:34 – I know.
0:31:36 – And untangle it from like-
0:31:39 – So this paper is very destructive in some sense.
0:31:42 – Other than saying like it probably doesn’t matter
0:31:44 if you take vitamin E, so that’s like news you can use.
0:31:46 You can take that home with you.
0:31:50 But I mean, I think it does more or less just highlight
0:31:54 some of the inherent and very deep limitations
0:31:58 with our ability to learn about some of these effects,
0:31:59 particularly when they’re small.
0:32:01 – Is this basically part of the sort of crisis
0:32:02 of reproducibility?
0:32:04 – I think it’s not unrelated.
0:32:06 So I often think about this idea of p-hacking,
0:32:10 which refers to the idea that you keep running your studies
0:32:12 until you get a significant result.
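A minimal sketch of that dynamic: simulate many small studies of an effect that is truly zero, and some come out “significant” at p < 0.05 anyway — and those are the ones that get written up. Study sizes and counts below are arbitrary:

```python
# Toy illustration of p-hacking / publication selection:
# test pure noise many times and "significant" results appear.
import random, statistics, math

random.seed(1)

def p_value(sample):
    """Two-sided p-value for mean != 0, normal approximation."""
    n = len(sample)
    z = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))

# 200 small "studies" of an effect that is truly zero.
p_values = [p_value([random.gauss(0, 1) for _ in range(50)])
            for _ in range(200)]

# With a 5% false-positive rate per test, roughly one in twenty
# null studies clears the significance bar by chance alone.
significant = sum(p < 0.05 for p in p_values)
print(significant, "of 200 null studies look significant")
```

Reporting only the winners from a process like this is why small effects in diet and health behavior are so hard to pin down.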
0:32:16 There’s a bunch of people interested in this process
0:32:19 of like how science evolves
0:32:24 and the ways in which the evolution of science
0:32:27 influences the science itself or the incentives
0:32:31 for research influence how science works.
0:32:35 And I think it’s particularly hard to draw conclusions
0:32:39 in these spaces like diet or these health behaviors
0:32:41 where the honest truth is probably a lot
0:32:43 of these effects are very small.
0:32:45 So if you ask the question like,
0:32:47 what is the effect of chia seeds on your health?
0:32:49 My dad is like really into chia seeds.
0:32:50 – That was a thing.
0:32:51 There was a moment.
0:32:53 – Well, he’s still in that moment.
0:32:53 He’s still in there.
0:32:56 And what is the effect of those on your health?
0:32:58 The actual effect is probably about zero.
0:33:00 Maybe it’s not exactly zero,
0:33:01 but it’s almost certainly about zero.
0:33:03 – But are there sometimes secret sleepers?
0:33:05 Like, whoa, there actually might be —
0:33:08 the only way to find out is to do these studies.
0:33:09 – Yeah, yeah.
0:33:10 And so maybe there are some secrets.
0:33:12 – Like maybe kale really is magic.
0:33:14 – Maybe it is, but it’s probably not.
0:33:18 I spent a lot of time with these diet data
0:33:21 and there’s these sort of like dietary patterns
0:33:23 like the Mediterranean diet,
0:33:26 which do seem to have some sort of vague support
0:33:30 in the evidence, but I would be extremely surprised
0:33:33 if we ever turn up like the one single thing.
0:33:35 – One magical food.
0:33:36 – So the point is it’s the pattern.
0:33:38 – It’s the pattern and it’s all the other things
0:33:39 that you’re doing, right?
0:33:42 If you smoke three packs a day and you never exercise,
0:33:45 but you eat some kale, that’s not gonna help you.
0:33:46 – Yeah, yeah.
0:33:47 – The kale’s not gonna help.
0:33:50 – What about when you really do need to affect change?
0:33:54 What are the ways in which these guidelines
0:33:57 can shift over time with kind of new sources of information
0:33:58 or data and statistics?
0:33:59 Like what’s the positive?
0:34:02 How does that actually play out in the right manner?
0:34:05 – Yeah, so I think there are times in which we,
0:34:08 the change in evidence is so big
0:34:12 and so like compelling that we can get changes.
0:34:15 Best practices in obstetrics,
0:34:19 like how do you deliver a breech baby, as an example —
0:34:21 those changed over time
0:34:25 because there was like one very big well-recognized study
0:34:28 that everybody agreed like this is now the state of the art.
0:34:30 – And it happens fast at that point?
0:34:31 – And then it happens pretty fast.
0:34:32 It doesn’t happen immediately.
0:34:33 Like you might have thought that those kind of changes
0:34:35 could be like immediately effected
0:34:36 and I think that they’re not,
0:34:39 but they do happen over time.
0:34:44 Those examples really rely on there being like a cohort
0:34:49 of sort of like experts who are all reading the guidelines
0:34:51 and sort of seeing that they changed
0:34:56 and then themselves are sort of doing this all the time.
0:34:59 I think part of what’s hard in the broader health behavior
0:35:01 space where it’s people who need to make the choices,
0:35:03 not physicians.
0:35:06 – Yes, when it’s in the home and those dark bedrooms.
0:35:08 – It’s like that’s much harder to get people
0:35:10 to change their behavior in those spaces.
0:35:12 – It’s not these pediatric guidelines.
0:35:13 Those are not effective.
0:35:14 – Yeah, I do not think those are effective.
0:35:16 Or I think we don’t see any evidence in the data
0:35:18 that those are effective at moving these,
0:35:20 at least in these kind of spaces.
0:35:21 – So what’s the answer?
0:35:23 How do we positively affect change
0:35:25 and gather these insights and have smart people
0:35:26 making good recommendations?
0:35:29 – So I mean, I think one answer is media attention.
0:35:32 The kind of few times when we see very large spikes
0:35:35 in change, they actually seem to correspond
0:35:36 with some media coverage.
0:35:39 On the flip side, like media can often be very bad.
0:35:41 Some of these big changes in these expert things
0:35:44 kind of resulted from media coverage,
0:35:46 which was really like sensationalist
0:35:48 and like totally inappropriate.
0:35:50 And, you know, it wasn’t like a very nice,
0:35:53 like, New York Times story about some study.
0:35:56 It was like a sensationalist 20/20.
0:35:57 – About what?
0:36:00 – About, this is about vacuum extraction,
0:36:02 which is a way of pulling the baby out
0:36:04 and has gone down a lot over time.
0:36:05 And it was like the sort of sensationalist
0:36:08 John Stossel 20/20 episode
0:36:09 about how it could hurt your baby,
0:36:11 which caused like big reductions.
0:36:12 – Interesting.
0:36:14 – Yeah, you know that like.
0:36:15 – But the science was there.
0:36:17 – The science, yeah, was there.
0:36:19 I mean, he overstated the science,
0:36:21 but it was, it was probably there.
0:36:23 – So it’s almost like a random confluence
0:36:24 of like when the science is there
0:36:26 and the media hits it the right way,
0:36:27 and then we see change?
0:36:28 – Yeah.
0:36:29 (laughing)
0:36:30 – It’s okay. – That’s something
0:36:31 to hope for.
0:36:34 – Yeah, that doesn’t feel like we can plan so much for that.
0:36:38 – You also study when we are resistant to change.
0:36:40 You looked specifically at diabetes,
0:36:42 people I think who had been diagnosed with diabetes
0:36:44 and then whether or not their behavior changed,
0:36:46 even given a certain amount of information.
0:36:49 So what do you see there about our resistance to change
0:36:52 even with the right kinds of information?
0:36:54 – I mean, I think one of the big challenges
0:36:56 in the health space at the moment
0:36:57 is that like so many
0:37:00 of the health problems that we have in the US
0:37:03 are like problems associated with behavior,
0:37:06 just the fundamental fact that like people do not eat great
0:37:09 and we have a lot of morbidity
0:37:11 and expense associated with that.
0:37:14 And I think there is often a lot of emphasis
0:37:16 on the idea like if we just get the information out,
0:37:19 if people just understood vegetables were good for them,
0:37:20 they would eat their vegetables.
0:37:21 – Doesn’t happen. – That’s not true.
0:37:24 I think, and so this paper is about sort of looking
0:37:26 at something where kind of a pretty extreme thing
0:37:28 happens to people, like they are diagnosed with diabetes
0:37:31 and we can see what happens to their diet.
0:37:33 And the answer is, it improves a tiny amount.
0:37:36 – Even with a real come to Jesus moment.
0:37:38 – Exactly, and a lot of new information.
0:37:41 – Right, and monitoring, follow-up, right?
0:37:42 I mean, you’re diagnosed with diabetes,
0:37:44 like you have to take medicine every day,
0:37:46 you got to go to the doctor like get a tester,
0:37:47 you know, test your insulin,
0:37:48 at least for some period of time.
0:37:51 So this isn’t like something where you can just forget
0:37:54 that it happened and even then the changes in diet,
0:37:56 you know, they’re there, but they’re really small.
0:38:00 They’re like, you know, like one less soda a week or something.
0:38:01 – Oh gosh. – Like really,
0:38:02 like really small.
0:38:03 – And how are you noticing these?
0:38:06 – We’re inferring information on diagnosis
0:38:09 from people’s purchases of testing products
0:38:11 and then following their grocery purchases.
0:38:13 So this is like an example of using,
0:38:15 you know, a different kind of data.
0:38:16 So not health data in this case.
0:38:19 It’s actually like Nielsen data.
0:38:21 So Nielsen data on what people buy,
0:38:24 but then, you know, using some like machine learning techniques
0:38:27 to try to figure out from the kinds of things people buy,
0:38:29 when were they diagnosed with diabetes,
0:38:31 and then looking at their diets over time.
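A drastically simplified sketch of that inference strategy, with an invented shopper and invented product names (the actual paper uses Nielsen scanner data and machine learning, not a hand-written rule): flag the first week a testing product appears, treat it as the diagnosis date, and compare soda purchases before and after.

```python
# Toy version of inferring a diabetes diagnosis from purchases.
# Products and the purchase history are made up for illustration.

TEST_SUPPLIES = {"glucose test strips", "lancets"}

def infer_diagnosis_week(history):
    """First week a diabetes-testing product shows up, else None."""
    for week, items in sorted(history.items()):
        if TEST_SUPPLIES & set(items):
            return week
    return None

def soda_per_week(history, weeks):
    """Average number of sodas bought per week over the given weeks."""
    counts = [sum(i == "soda" for i in history.get(w, [])) for w in weeks]
    return sum(counts) / len(counts)

history = {
    1: ["soda", "soda", "chips"],
    2: ["soda", "bread"],
    3: ["soda", "soda"],
    4: ["glucose test strips", "soda"],   # inferred diagnosis week
    5: ["soda", "kale"],
    6: ["bread", "soda"],
}

dx = infer_diagnosis_week(history)
before = soda_per_week(history, range(1, dx))
after = soda_per_week(history, range(dx, 7))
print(dx, round(before, 2), round(after, 2))  # -> 4 1.67 1.0
```

The before/after gap here echoes the finding in the conversation: the diet change after diagnosis is detectable but small.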
0:38:33 – So is the answer that there has to be
0:38:36 some sensational story that talks about like–
0:38:38 – I mean, I think there, I’m not even sure that would help.
0:38:40 I think part of the problem is people really like
0:38:42 the diets that they’re comfortable with.
0:38:45 Like diet is like such a habit formation thing,
0:38:48 and you know, people are willing to make
0:38:52 important health sacrifices to maintain the diet
0:38:53 that they like.
0:38:55 We get into some of these questions of preferences,
0:38:57 like, and you know, if people,
0:38:59 if that is the choice that people wanna make,
0:39:03 like should we be trying to intervene with policy?
0:39:05 Like let’s say everybody had all the information,
0:39:07 they knew that they shouldn’t drink so much soda
0:39:08 and that they should lose weight,
0:39:10 but they still chose not to.
0:39:13 Like do we wanna develop policies that affect that?
0:39:13 I’m not sure.
0:39:15 – Yeah, maybe that’s just free will.
0:39:16 – Yeah, maybe that’s just free will.
0:39:18 And it comes up in the parenting stuff too.
0:39:20 Like, you know, how much do we wanna be externally
0:39:23 controlling the choices people make with their kids,
0:39:25 even if we don’t think that they’re the right choices.
0:39:28 – But I do think there’s a segment of people
0:39:30 who want to make the change, but the gravity,
0:39:32 you know, because of the information,
0:39:33 but the gravity of the habit is so much
0:39:36 that it’s hard to know where to go about it.
0:39:39 – I guess I would say, where do you see this data going?
0:39:42 Like if you had your fantasy for where you want
0:39:44 the kind of data and the way that we see this data evolving
0:39:47 and the way that you see that kind of percolating out
0:39:50 to the public, I mean, in terms of being sort of a translator
0:39:51 and providing people the tools,
0:39:54 like what do you wanna see in terms of the way the system
0:39:57 responds to or integrates this data in the future?
0:40:00 – Yeah, I mean, I think the big message of the book
0:40:03 is in some sense that you should use the data
0:40:07 to make yourself confident and happy in your choices.
0:40:10 I think so much of what is hard about parenting
0:40:14 is that in the moment you are not often confident
0:40:17 in your choices, and then when somebody asks you,
0:40:20 like, why did you do that, then you feel bad, right?
0:40:23 And I think that there’s a sense in which sort of
0:40:24 looking at the data, but then confronting like,
0:40:27 well, we don’t know, but you’d be like, okay, I made this choice.
0:40:28 You know, I decided to let my kids watch an hour
0:40:31 of TV every day, because like I thought about it
0:40:33 and I thought there wasn’t any data,
0:40:35 and like that’s the choice that I made,
0:40:37 that sort of that confidence is like important
0:40:40 for being happy, and if we could sort of like
0:40:44 move in that direction, I think that would be good.
0:40:46 – It reminds me a lot of what one of my good friends,
0:40:47 Brandy said to me when I was in the trenches
0:40:49 of like babyhood and having a lot of anxieties
0:40:52 around all these hot button issues, breastfeeding,
0:40:53 sleep time, like all of it.
0:40:55 She had been through it, her kids were in college,
0:40:57 and she was like, let me give you a piece of advice.
0:40:59 Be wrong, but be wrong with confidence.
0:41:00 – Yes.
0:41:01 – Just be wrong with confidence.
0:41:02 That’s all that matters.
0:41:03 – Yeah. – Yeah.
0:41:04 – No, exactly.
0:41:05 – I love confidence, I love that.
0:41:07 Yes, I am wrong with confidence so frequently.
0:41:08 – Yes, and actually it turns out to be right.
0:41:09 – Like it turns out that it’s fine.
0:41:10 – The truth is there’s a lot of good options.
0:41:11 – A lot of good options.
0:41:12 – Yeah. – Thank you so much
0:41:14 for joining us on the A16Z podcast.
0:41:15 – Thank you for having me.
with Emily Oster (@ProfEmilyOster) and Hanne Tidnam (@omnivorousread)
Are chia seeds actually that good for you? Will Vitamin E keep you healthy? Will breastfeeding babies make them smarter? There’s maybe no other arena where understanding what the evidence truly tells us is harder than in health… and parenting. And yet we make decisions based on what we hear about in studies like the ones listed above every day. In this episode, Brown University economics professor Emily Oster, author of Expecting Better and the recently released book Cribsheet: A Data-driven Guide to Better, More Relaxed Parenting, from Birth to Preschool, in conversation with Hanne Tidnam, dives into what lies beneath those studies… and how to make smarter decisions based on them (or not). Oster walks us through the science and the data behind the studies we hear about — especially those hot-button parenting issues that are murkiest of all, from screen time to sleep training.
How can we tell what’s real and what’s not? Oster shows us the research about how these guidelines and advice that we are ”supposed” to follow get formalized and accepted inside and outside of healthcare settings — from obstetrics practices to pediatrics to diet and lifestyle; how they can (or can’t) be changed; and finally, how the course of science itself can be influenced by how these studies are done.