From Kill Switch: The Glassholes Are Back

AI transcript
0:00:01 This is an iHeart Podcast.
0:00:03 Guaranteed human.
0:00:06 Run a business and not thinking about podcasting?
0:00:07 Think again.
0:00:11 More Americans listen to podcasts than ad-supported streaming music from Spotify and Pandora.
0:00:15 And as the number one podcaster, iHeart’s twice as large as the next two combined.
0:00:17 Learn how podcasting can help your business.
0:00:19 Call 844-844-IHEART.
0:00:24 On Masters of Scale, iconic leaders reveal how they’ve beaten the odds.
0:00:28 Asking really strong questions is a superpower.
0:00:33 You want to show up with something radically different and how they’ve grown companies to incredible heights.
0:00:36 The greatest rewards always come from the greatest risks.
0:00:37 That’s hit the gas.
0:00:40 Airbnb, Zillow, Microsoft, Liquid Death, and more.
0:00:43 Hear from the founders who’ve changed the game.
0:00:46 It’s anything but business as usual.
0:00:53 Find Masters of Scale on Apple Podcasts, Spotify, YouTube, or wherever else you get podcasts.
0:01:04 Pushkin.
0:01:07 Hey, it’s Jacob.
0:01:12 Today we are going to play for you an episode of a podcast called Killswitch.
0:01:17 The show is hosted by a Pulitzer Prize-winning journalist named Dexter Thomas.
0:01:25 And it’s about technology, the problems that new technology can create, and how we navigate living in this technological world.
0:01:28 The episode you’re about to hear is about wearable tech.
0:01:34 Basically, how we got from Google Glass more than a decade ago to where we are now.
0:01:35 I hope you like it.
0:01:52 There’s this TikTok that somebody posted where she was getting waxed.
0:01:58 I had a Brazilian wax done about three weeks ago, and it’s been haunting me ever since.
0:02:04 The girl that was giving me the wax, she was wearing Meta glasses.
0:02:10 And she’s like, they’re not charged, they’re not on, like I promise.
0:02:15 Wearable technology is having a moment.
0:02:24 There’s been this wave building for a while, where the stuff that seemed like science fiction, like having a computer on your wrist, is completely mainstream now.
0:02:29 And now, the next step of smart glasses is starting to get there too.
0:02:32 The best example of this is Meta’s Ray-Bans.
0:02:38 These glasses blend in so well that people are starting to wear them as their normal, everyday prescription glasses.
0:02:44 With the added benefit of being able to, among other things, record audio and video.
0:02:48 And this brings in a lot of questions about privacy.
0:02:50 These aren’t new questions.
0:02:52 We’ve had well over a decade to think about this.
0:03:00 But it’s got to be uncomfortable to have this societal quandary about surveillance thrown at you when you’re not wearing any pants.
0:03:06 I could not stop thinking, like, could this girl be filming me right now?
0:03:08 Like, could she be filming?
0:03:16 Meta will tell you that they have an LED indicator light that goes on when you are recording.
0:03:19 But someone’s going to figure out a way to hack that.
0:03:22 Victoria Song is a senior reviewer at The Verge.
0:03:27 Someone’s going to figure out a sticker to buy so that you can put it over the case.
0:03:31 Like, what is going to be the social etiquette going forward for these glasses?
0:03:34 Are they going to be banned from certain professions?
0:03:37 Victoria’s been testing and reporting on wearables for over a decade.
0:03:47 So she’s seen the technology evolve from fitness trackers that count your steps to rings that track your heart rate to smartwatches to pendants and glasses.
0:03:50 But this time feels a little different.
0:03:56 For whatever reason, the big tech powers that be have convened to decide that wearables are it.
0:04:00 Wearables are how we’re going to get AI to really take off.
0:04:06 Like, we’re going to have AI leave the computer and be on your body, be on you as a person.
0:04:10 And you’re going to take AI with you in every aspect of your life.
0:04:13 Is that the future that we’re heading towards?
0:04:14 Is that the future we want to head towards?
0:04:17 These are the conversations that we’re going to have to start having.
0:04:32 From Kaleidoscope and iHeart Podcast, this is Killswitch.
0:04:34 I’m Dexter Thomas.
0:04:36 I’m sorry.
0:05:22 This isn’t the first time we’ve run into questions about privacy and smart glasses.
0:05:29 Most people first started thinking about this a little over a decade ago in 2013 with the release of Google Glass.
0:05:37 When I first started thinking that wearables were maybe something that I might be interested in, it was Google Glass.
0:05:38 Oh, yes.
0:05:39 Google Glass.
0:05:40 I see.
0:05:40 I see your face.
0:05:43 So we obviously got to talk about Google Glass.
0:05:47 When you first heard about Google Glass, what did you think about it?
0:05:51 It was sort of straight out of science fiction.
0:05:56 There was this marketing sizzle reel showing what Google Glass was supposed to be.
0:05:57 You ready?
0:05:58 Right there.
0:05:59 Okay, Glass.
0:06:00 Take a picture.
0:06:04 Okay, Glass.
0:06:04 Record a video.
0:06:23 And, you know, it’s an augmented reality device that shows you notifications of stuff that you need based on the world around you in real time.
0:06:28 It’s a total game changer if you really think about could this actually work.
0:06:35 A lot of people probably were thinking of things like Iron Man and Tony Stark’s smart glasses.
0:06:47 There’s just a lot of science fiction imagery that exists with heads-up displays, even though this technology actually dates all the way back to World War II with fighter pilots and heads-up displays in the windshields.
0:06:59 If you were to really trace the genesis of it, it’s a really old concept in technology, but I think Google Glass was the first that kind of plucked it from science fiction into the realm of like, hey, we have a product.
0:07:06 And it’s like, I think, what was it, like $1,500 somewhere in that ballpark, and you can buy it if you have the money.
0:07:15 And people did wear the Google Glass Explorer edition out into the world, and it was like a really surreal thing that happened.
0:07:18 Google Glass was not subtle.
0:07:19 It barely looked like glasses.
0:07:32 It was this bar that went across your forehead that sat on the bridge of your nose, and instead of lenses, there was this metal square thing and this inch-long transparent camera that stuck out over your right eye.
0:07:36 It reminded me of the Dragon Ball Z scouters that Vegeta wears.
0:07:37 Yo, yeah!
0:07:39 The power-level scouter!
0:07:46 Yeah, it reminded me kind of like that, because there wasn’t actually, like, glasses for people to see through.
0:07:50 Vegeta, what does the scouter say about his power level?
0:07:53 It’s over 9,000!
0:07:55 What, 9,000?
0:07:57 There’s no way that could be right!
0:07:59 Cut it!
0:08:01 Yo, I hadn’t thought about that.
0:08:06 It looks like the Dragon Ball Z, but the scouter, oh my god.
0:08:07 Yeah.
0:08:14 You know, Google Glass really leaned into the science fiction aesthetic of it, right?
0:08:16 It wasn’t supposed to blend in.
0:08:18 You were supposed to stand out when you wore it.
0:08:20 Yeah, it was very distinctive.
0:08:23 When you saw someone wearing it, you’d see it, and you’d immediately clock it.
0:08:29 People would immediately clock it, and in a lot of cases, they’d make fun of it.
0:08:32 Not just because of surveillance or because they didn’t want to be recorded.
0:08:36 Sometimes the criticism was just more surface-level and basic.
0:08:43 Like, why would you want to wear that thing on your face and show everybody that you’re a weirdo who records everything?
0:08:47 Back when it came out, The Daily Show had a whole segment that was making fun of it.
0:08:58 Our nation has long been haunted by discrimination, and while we’ve made great strides over the years overcoming the challenges, there are still those that suffer from the barbs of injustice.
0:09:02 I was denied admission and service at various establishments.
0:09:03 I was mugged in the Mission District.
0:09:09 I was asked to leave a coffee shop, and the reason why was because we don’t do that here.
0:09:28 Yes, it seems even in this day and age, you can still be treated differently just because of how you look.
0:09:31 Wearing a $1,500 face computer.
0:09:36 So, you know, that’s why it caused the controversies that it did.
0:09:44 I think the most famous example was just a woman wearing the Google Glass while she was out in San Francisco and someone ripping it off her face.
0:09:45 Video.
0:09:47 Okay.
0:09:49 It’s on video now.
0:09:51 Okay.
0:10:00 Sarah Slocum was a social media consultant, and she was wearing a Google Glass at a bar in San Francisco.
0:10:06 She was showing some friends how Google Glass works, and some other people at the bar saw it,
0:10:10 and got mad at her, and started yelling at her because they didn’t want to be recorded.
0:10:20 So, she started recording a video of them yelling at her, which maybe is kind of ironic, but maybe also a preview of what the future was going to look like.
0:10:26 So, it’s kind of hard to tell because everyone’s yelling, but there’s a moment in the footage where you can hear someone say,
0:10:29 Get your Google Glasses out of here.
0:10:32 They start yelling back and forth, and she also flips them off.
0:10:35 And again, this is all on the video that she’s recording.
0:10:38 And then someone grabs the glasses and rips them off of her face.
0:10:45 Sarah Slocum was featured in the local news about this, and she also told her story in a Daily Show segment about what happened.
0:10:51 I was at a bar, and people started verbally accosting me.
0:10:54 They started getting physical immediately when I started recording.
0:10:56 They ripped them off my face, basically, and they ran outside.
0:10:58 It was a hate crime.
0:11:02 The silly thing is that they’re going to be wearing these things probably in a year.
0:11:13 I’m going to reserve my comments about calling this a hate crime, and instead, I’m just going to move on to the facts and say that Sarah Slocum’s prediction was incorrect.
0:11:16 Pretty much nobody was wearing these things in a year.
0:11:21 When you looked at it, you couldn’t help but wonder, oh, is this person recording me?
0:11:30 And I mean, there was a recording light and all of that stuff, but it really kind of forced the whole sci-fi aspect of it into current everyday life
0:11:35 in a way that I don’t think society was really prepared for at that point in time.
0:11:42 So I also remember when Google Glass was announced, and actually thinking, oh, you know what?
0:11:43 This would actually be kind of cool.
0:11:49 My thought was I could record concerts or something like that.
0:11:53 That might be kind of cool because using your phone, it wasn’t really a great experience either.
0:11:58 You’re in people’s way. And I go to shows of artists who I know.
0:12:04 Sometimes they ask me to film their sets, but it’s kind of weird being a guy up on the stage with a camera.
0:12:13 But I remember going from being kind of interested in the Google Glass to realizing, oh, I think maybe people hate this.
0:12:15 And then it was gone.
0:12:18 The life cycle of it was so fast.
0:12:19 Did it feel like that for you?
0:12:21 Oh, it was absolutely really fast.
0:12:23 And there’s a lot of factors that go into that.
0:12:28 One, the price was prohibitive to the things that it could actually do.
0:12:32 Like the video that they showed was truly sci-fi revolutionary.
0:12:35 Couldn’t actually deliver on that, right?
0:12:38 So it was much more limited in what it could actually do.
0:12:45 And then it was so ostentatious when you wore it that there’s no way of being discreet, right?
0:12:51 So you’re basically becoming an ambassador of future technology for the average person.
0:12:55 Like, do you want to have people come up to you on the street and be like, well, what is it that you’re wearing?
0:12:56 Are you filming me?
0:12:56 Are you recording me?
0:12:58 Nah, you’re out here just trying to have a nice time.
0:13:03 And, you know, in 2013, we were not in the TikTok era.
0:13:08 We were not in an era where everyone is used to people just whipping out their phones and filming life.
0:13:14 The definition of a public space and a private space was different 10 years ago than it is now.
0:13:24 I think Google Glass was like a really unfortunate example of a good idea or at least like a substantive idea and really poor timing.
0:13:31 And, of course, there was the phrase that went along with anybody who had it, whether or not they were doing anything wrong, Glasshole.
0:13:32 Yeah, Glasshole.
0:13:37 I kind of feel for Google because they’re never going to live that down.
0:13:42 I don’t know if I’ve ever talked to anybody who said they feel bad for Google.
0:13:43 This is a new one.
0:13:47 It’s one of those things where it’s like, I see the vision, you know, pun intended.
0:13:50 I see where you really wanted to go with that.
0:13:58 I don’t think it was necessarily a bad idea, but you kind of made some choices that were not in step with reality.
0:14:07 And now you have to live with the fact that you coined the term Glasshole, like indelible, forever, in perpetuity.
0:14:11 Google Glass was a massive failure for Google.
0:14:16 They did try to salvage it by selling it to companies where maybe there would be some industrial uses for it.
0:14:26 The idea was that this whole thing about privacy and people feeling uncomfortable wouldn’t really matter if it’s a worker in a factory using Google Glass to check inventory or something.
0:14:28 But that didn’t really catch on either.
0:14:32 And Google basically gave up on trying to push smart glasses.
0:14:36 It looked like, as a society, we just didn’t want it.
0:14:42 But now, all of a sudden, there’s been a resurgence in wearables and specifically in smart glasses.
0:14:46 And all those same questions are coming back up.
0:14:51 So maybe saying that society didn’t want it is the wrong framing.
0:14:54 Maybe it’s just that we didn’t want it yet.
0:14:57 So are we ready for smart glasses now?
0:14:59 That’s after the break.
0:15:08 Run a business and not thinking about podcasting?
0:15:09 Think again.
0:15:14 More Americans listen to podcasts than ad-supported streaming music from Spotify and Pandora.
0:15:18 And as the number one podcaster, iHeart’s twice as large as the next two combined.
0:15:21 So whatever your customers listen to, they’ll hear your message.
0:15:25 Plus, only iHeart can extend your message to audiences across broadcast radio.
0:15:27 Think podcasting can help your business?
0:15:28 Think iHeart.
0:15:31 Streaming, radio, and podcasting.
0:15:33 Let us show you at iHeartAdvertising.com.
0:15:36 That’s iHeartAdvertising.com.
0:15:40 On Masters of Scale, iconic leaders reveal how they’ve beaten the odds.
0:15:44 Asking really strong questions is a superpower.
0:15:50 You want to show up with something radically different and how they’ve grown companies to incredible heights.
0:15:52 The greatest rewards always come from the greatest risks.
0:15:53 That’s hit the gas.
0:15:57 Airbnb, Zillow, Microsoft, Liquid Death, and more.
0:16:00 Hear from the founders who’ve changed the game.
0:16:03 It’s anything but business as usual.
0:16:10 Find Masters of Scale on Apple Podcasts, Spotify, YouTube, or wherever else you get podcasts.
0:16:21 Where are we right now with wearables and specifically with smart glasses?
0:16:25 Right now, I would say we are entering the age of AI hardware.
0:16:35 There are these wearable pendants that are always on you at any given point in time, listening to every single conversation you have, acting as your quote-unquote second memory.
0:16:38 That’s one vein of wearable AI hardware.
0:16:45 The others are things like headphones and smart watches, so existing popular forms.
0:16:50 And then kind of coming out of left field again, we have smart glasses.
0:16:58 Specifically, the Ray-Ban Meta smart glasses, emerging as a kind of third track of AI hardware.
0:17:03 You know, I wrote about them, the Meta smart glasses, and I had their comms
0:17:05 people reach out to me and be like, actually, could you call it AI glasses?
0:17:14 Because this is how closely tied together their thesis of AI and glasses are.
0:17:22 Their CTO, I believe, Andrew Bosworth, has basically gone on the record saying, like, this is how we get AI to really take off.
0:17:23 And it’s smart glasses.
0:17:31 The ideal hardware form factor for AI going forward will be these smart glasses.
0:17:42 This is kind of interesting because if you look on some corners of social media, you’ll see a lot of people get mad about AI features being added to products that really would be fine without it.
0:17:47 But tech companies are still obviously trying to put AI into everything anyway.
0:17:52 And in wearables, they seem to be betting on us accepting it.
0:17:58 People have had ChatGPT, Claude, all of these AI assistants on their computers for a while.
0:18:02 And a lot of us have come to the same conclusion that AI can be incredibly dumb.
0:18:14 But if it’s on you, and from all the conversations that I’ve been having with AI startups and with big tech investing in the future of AI, if it’s on you, that’s a different proposition.
0:18:19 And with the Meta Ray-Bans specifically, the AI is sort of a bonus feature.
0:18:25 It’s a way for Meta to get people accustomed to AI without explicitly having to force it on them.
0:18:29 These are functional devices where you don’t have to use the AI.
0:18:31 You never have to say Meta AI.
0:18:33 You don’t ever have to use that wake word.
0:18:34 But it’s there.
0:18:36 If you’re curious, you could use it.
0:18:47 And because there are other reasons why you would use these glasses besides AI, it becomes a very easy entryway, a door for Meta to open.
0:18:50 And say, hey, you’re using it for these purposes.
0:18:57 Wouldn’t it be really great if you wanted to use it sort of like how you use Siri, but actually can do more?
0:19:01 And why don’t you experiment with these actually very useful use cases?
0:19:09 And that’s why these particular glasses have actually caught on like wildfire in the blind and low vision community.
0:19:16 Because the AI features genuinely have created a game-changing, life-changing technology for them.
0:19:26 Like I’ve spoken with a bunch of blind and low vision users who tell me that the meta glasses enable them to live more independent lives.
0:19:32 They have this live AI mode that allows you to just ask the AI what’s going on in my surroundings.
0:19:34 It can read a menu for you.
0:19:43 And for a sighted person, that might be like, oh my God, why do I need an AI to read a menu for me unless it’s in a different language and translating it?
0:19:51 But for a person who is low vision or blind, just asking an AI, hey, you know, my kitchen is really messy.
0:19:54 Where is this particular appliance?
0:20:05 And having the AI be able to tell you to save you some emotional labor, to save you some of the toll of having to ask a living person to see your very messy kitchen.
0:20:10 It provides an actual life-changing service for that particular community.
0:20:13 Victoria is not kidding.
0:20:17 I know someone who’s blind and they let me use their smart glasses once.
0:20:19 And it was genuinely amazing.
0:20:27 So we were walking around on this college campus and the glasses had these little speakers in them and it would read signs to me from across the quad.
0:20:31 It could tell me if there was a lamp pole or a bicycle in my path.
0:20:36 Basically, just a whole bunch of things that would help somebody to lead an independent life.
0:20:47 So if someone ever tells me that AI is useless, I got to stop them right there because even just from my very limited peek into what this can do for people, it can be life-changing.
0:20:54 But I don’t think that reason is why smart glasses are taking off now in a way that Google Glass didn’t.
0:20:56 It’s not just the technology.
0:20:57 It’s the timing.
0:21:07 We’re in an age now where people are filming each other all the time, whether it’s making TikToks on their phone or surveilling their neighbors with a ring camera.
0:21:12 We’re used to seeing cameras outside in a way that we just weren’t in 2013.
0:21:22 So it sort of makes sense that even though these things are conceptually very similar to Google Glass, the Meta Ray-Bans are being more widely adopted now.
0:21:28 I’ve seen them in the wild quite a bunch lately, and it’s from non-techies.
0:21:32 Like, it makes sense to me if I go to CES and there’s a bunch of people wearing them.
0:21:36 Makes sense if I’m at a press conference and there’s a lot of influencers around.
0:21:40 But I was just walking out and I was seeing a friend that I hadn’t seen in 10 years.
0:21:43 And I looked at them and I was like, are those Meta Ray-Bans smart glasses?
0:21:44 And they’re like, oh, yeah, I love them.
0:21:47 I love to take concert footage with them.
0:21:50 And, you know, I was at a concert in June.
0:21:51 I turned to my side.
0:21:52 There’s a girl.
0:21:54 She has Meta Ray-Bans.
0:21:59 And I find her TikTok later that day from that concert with the footage taken from these glasses.
0:22:03 So to your point, that example has become real.
0:22:09 It is a thing that the technology has finally caught up and it’s a discreet form factor and it’s so much more affordable.
0:22:11 It’s around like $200 to $300.
0:22:17 And here’s the other thing that I think is kind of a factor in it is that it lets you use your phone less.
0:22:22 And phones are almost 20 years old at this point.
0:22:26 And we’re, culturally speaking, kind of fatigued.
0:22:39 There’s so many products out there about helping you focus more, putting down your phone, locking it, fricking the screen so that you’re not, like, glued to this device anymore.
0:22:43 And so the proposition now with wearables is that it enables you to do that.
0:22:44 You can still stay connected.
0:22:47 You can still get your notifications and be reachable.
0:22:51 But you can triage it and you don’t have to look at your phone quite as much.
0:23:02 The reason why I knew that these would not just fall away and be completely Google Glass 2.0 was my spouse has become a Luddite.
0:23:07 They absolutely abhor my job and all the wearable technology that I test.
0:23:10 And they were like, I’m going to cop me a pair of those.
0:23:12 Really? The Ray-Bans?
0:23:13 They got their own pair.
0:23:14 Yeah, they got their own pair of Ray-Bans.
0:23:18 That is their main pair of glasses that they wear every day.
0:23:24 They love cars, so they go walk on the street and they can go, oh, hey, Meta, what model car is that?
0:23:28 And, like, in that instant, not have to pull out their phone and look all that stuff up.
0:23:30 That was compelling enough for them.
0:23:32 And the price point was good enough.
0:23:35 And the look of the glasses was not dorky.
0:23:40 It was just a complete, like, whoa moment for me.
0:23:50 On top of the timing and the price, the look of the Meta Ray-Bans is probably another really big factor of why these are working in a way that Google Glass never did.
0:23:57 Because you don’t have to consciously sign up to be, as Victoria called it, an ambassador for the future when you wear these things.
0:24:02 And in retrospect, that’s probably another place where Google Glass kind of messed up.
0:24:10 They were relying on early adopters to be ambassadors and hoping that they would get evangelists that would make everyone else want to jump on board.
0:24:15 Instead, they got Sarah Slocum, the young lady who got her glasses ripped off of her face.
0:24:22 But you don’t need a single ambassador if the product just looks like something that’s already out there.
0:24:24 Everyone is already an ambassador.
0:24:26 You’re just joining the club.
0:24:30 That’s one thing that Meta hit right on the nail.
0:24:35 They partnered with EssilorLuxottica, which is the biggest eyewear brand in the world.
0:24:42 It has a bunch of different brands under its umbrella, including Oakley, which they just released a pair of smart glasses with under the Oakley branding.
0:24:45 And Ray-Ban, which is iconic worldwide.
0:24:54 So to be able to have those fashionable brands and to say, here you go, you’re going to look, if not stylish, normal in them.
0:24:56 It was a huge thing.
0:25:03 For smart glasses to go truly mainstream, they have to be good looking because humans are vain creatures.
0:25:07 We are constantly obsessed with what’s going to make us look good.
0:25:11 And I don’t care if you have the coolest piece of tech on the planet that’s the most convenient.
0:25:14 You’re putting it on your face, on your eyes.
0:25:15 All right.
0:25:19 So some of y’all have listened this far and thought, yo, this sounds terrible.
0:25:22 Anybody who wears these things is a complete weirdo.
0:25:27 But some people you’re listening and you’re thinking, these actually sound kind of cool.
0:25:30 You might be thinking about getting a pair or maybe you already have a pair.
0:25:32 So here we are again.
0:25:41 We got to realize that even though it felt like we societally rejected Google Glass 10 years ago, maybe all we did was just kick the can down the road.
0:25:43 We just put off making a decision.
0:25:48 Maybe when Sarah Slocum said that we’d all be wearing these, she wasn’t wrong.
0:25:51 She was just a little too early.
0:25:55 Which brings us back to the TikToker getting a Brazilian wax.
0:25:59 The question of how much privacy do we really want?
0:26:03 We got a listener question.
0:26:12 It was a spicy question from a VergeCast listener where they were like, is it okay for me to wear it while I have intimate relations with my wife?
0:26:21 And I was like, oh, that is genuinely a question that we have to grapple with if these are to become a mainstream piece of technology.
0:26:24 When is it okay to wear these devices?
0:26:27 What conversations are you supposed to have when you’re wearing these devices?
0:26:29 Is it just on vibes?
0:26:32 Are we treating them like smartphone cameras, right?
0:26:33 Right.
0:26:39 My contention is that with a smartphone, you know when someone’s recording because they hold it a specific way.
0:26:45 There’s just kind of like a body language that goes with holding a camera and recording.
0:26:51 But because of the form factor being so discreet, which is a benefit in many ways.
0:26:55 Like for content creators who are trying to do first person point of view.
0:27:06 But at the same time, is someone just adjusting their glasses in a specific way going to start looking like you’re recording something even if you don’t have smart glasses?
0:27:11 You know, the Meta Ray-Bans do have an indicator LED light, which tells you when someone is recording.
0:27:15 And in very bright outdoor lighting, I would say most people would not notice it.
0:27:19 And that is a thing that I test for every single iteration that comes out.
0:27:25 Living in 2025 means you know that you could be recorded at any time.
0:27:27 But these glasses add another layer.
0:27:32 When someone pulls out a phone and starts recording, at least you have a visual indication that it’s happening.
0:27:35 But what happens when you don’t know you’re being recorded?
0:27:37 That’s after the break.
0:27:45 Run a business and not thinking about podcasting?
0:27:46 Think again.
0:27:51 More Americans listen to podcasts than ad-supported streaming music from Spotify and Pandora.
0:27:55 And as the number one podcaster, iHeart’s twice as large as the next two combined.
0:27:58 So whatever your customers listen to, they’ll hear your message.
0:28:02 Plus, only iHeart can extend your message to audiences across broadcast radio.
0:28:04 Think podcasting can help your business?
0:28:05 Think iHeart.
0:28:08 Streaming, radio, and podcasting.
0:28:10 Let us show you at iHeartAdvertising.com.
0:28:12 That’s iHeartAdvertising.com.
0:28:17 On Masters of Scale, iconic leaders reveal how they’ve beaten the odds.
0:28:21 Asking really strong questions is a superpower.
0:28:27 You want to show up with something radically different and how they’ve grown companies to incredible heights.
0:28:29 The greatest rewards always come from the greatest risks.
0:28:30 That’s hit the gas.
0:28:59 So, if you’re outside while you’re listening to this, maybe you’re going shopping, maybe you’re going for a run, taking a walk in the park.
0:29:02 I want you to enjoy this moment.
0:29:04 Just take it in, seriously.
0:29:12 Because it might not be that much longer that you can do this privately without having another person looking you up.
0:29:18 What I mean by that is that the privacy issue of the Meta Ray-Bans goes beyond just being recorded without your consent.
0:29:28 Last year, a couple of Harvard students were able to combine the recording and the AI functionality of the Meta Glasses to dox people in real time as they walked by.
0:29:31 We built glasses that let you identify anybody on the street.
0:29:33 Uh, Cambridge Community Foundation.
0:29:37 Oh, hi, ma’am.
0:29:39 Wait, are you a Betsy b****?
0:29:40 Yes.
0:29:40 Oh, okay.
0:29:44 I think I, uh, I think I met you through, like, the Cambridge Community Foundation, right?
0:29:44 Yeah.
0:29:45 Yeah, yeah.
0:29:46 It’s great to meet you.
0:26:46 I’m Caine.
0:29:47 Terrifying.
0:29:49 Just absolutely terrifying.
0:29:51 And they’re college students who were able to put that together.
0:29:54 They just were able to jerry-rig something.
0:29:58 And ironically, several months later, they are now coming out with their own Smart Glasses product.
0:30:08 So it’s just kind of, it’s a whole ouroboros of just, you know, on the one hand, they were raising awareness like, oh my god, this could be VCU’s.
0:30:10 But also, we have a product now.
0:30:10 Cool.
0:30:17 Now, to be fair, this new smart glasses product that they’re selling is not the facial recognition lookup that they demoed on campus.
0:30:21 Their pitch is that their new glasses go further than Meta’s glasses.
0:30:25 Meta’s glasses just turn on and record when you tell them to.
0:30:27 Their product will always be on.
0:30:30 One of the founders told the magazine Futurism that, quote,
0:30:49 And their glasses won’t have an indicator light that tells you it’s recording because, again, it’s always recording.
0:30:57 This could bring up some legal issues, which, by the way, are issues that other recording wearables are probably also going to run into at some point.
0:31:01 Some people live in two-party consent recording states.
0:31:06 So you have companies making tech that could be illegal in some respects.
0:31:07 Yeah.
0:31:12 In California, specifically, both parties have to consent to being recorded.
0:31:17 So I live in New Jersey and work in New York, which are both one-party consent states.
0:31:24 So technically, I can walk out into public spaces with the thing on and it’s recording, and I don’t require anybody else’s consent.
0:31:31 But if I’m going to California, is it okay for me to wear an always-on recording device while I’m on public transit?
0:31:37 It suddenly becomes a very strange, murky, gray area.
0:31:46 If you look at Meta’s privacy policy for the smart glasses, what they say is the best practices for using these devices out in the world.
0:31:48 And they can basically say, hey, we published this.
0:31:51 We’ve told people, don’t be a jerk.
0:31:52 We’re good, right?
0:32:00 And when people inevitably are jerks using their technology, their defense is that they say, well, we never intended them to use it that way.
0:32:03 Like, you can think about AirTags in that respect.
0:32:09 Apple came out and were like, we made this an incredibly convenient device for you.
0:32:17 And a small bunch of bad apples are going to use it to stalk people and use them in ways that we absolutely didn’t intend.
0:32:20 But in the fine print, we’re going to say that that’s illegal.
0:32:21 We don’t condone that.
0:32:23 Legally, we’re scot-free.
0:32:31 So if you’re not familiar with these, Apple’s AirTags are these small tracking devices that you can attach to your keys or your wallet to keep track of them.
0:32:37 Products like this existed before Apple’s version, but Apple just made them more convenient, which made them more mainstream.
0:32:45 And after the product became more mainstream, the obvious bad things you can do with this technology also became more mainstream.
0:32:52 People started sneaking AirTags into people’s purses or attaching them to their cars so they could track them and stalk them.
0:32:58 So Apple did make a notification system that would alert you if an unknown tracking device is following you.
0:33:03 But you would only get that notification if you also had an Apple iPhone.
0:33:08 Months later, they did put something out for Android, but even then you had to know how to use it.
0:33:17 And as cases of people being stalked kept hitting the news, they’d keep making modifications, like making the AirTag beep more if it’s away from its owner for long enough.
0:33:20 But of course, people started working on how to disable that.
0:33:23 There was another solution for this.
0:33:26 Stop with the software updates and just cancel the product, pull it off the shelves.
0:33:31 But Apple didn’t do that because AirTags are really popular.
0:33:34 People really like being able to find their lost stuff.
0:33:43 In the case of the Apple AirTags and now for the Meta Ray-Bans, the trick seems to be to find enough consumers for whom the product is indispensable.
0:33:47 People who think that the benefits outweigh the risks.
0:34:03 I think this is really where the crux of all of this lies: AirTags are so convenient that most people, the vast majority of people, will be like, yeah, I’m good with that, because AirTags are so convenient and I’m not the bad apple using it in that way.
0:34:13 It’s like, what app, what use case will be so convenient that you are willing to overlook the dystopian nightmare that comes along with it?
0:34:29 I mean, I’m feeling like even if the vast majority of people with the Meta Ray-Bans or any of the smart glasses use them in very responsible ways, just the fact that it’s out there is going to alter how we can walk around in society, period.
0:34:31 No, you’re absolutely right.
0:34:37 Like in testing these devices, I don’t speak to myself as much as I used to because I wore one of these devices into a bathroom.
0:34:41 I commented on my bowel movement and it recorded it.
0:34:45 And then it generated a to-do for me to have Lactaid again.
0:34:49 And I was like, this is the rudest thing that’s ever happened to me.
0:35:09 But also, holy crap, this is our dystopian future because when the AI is in your glasses, when the AI is in a pendant that sits around your neck, when it’s in your smartwatch, when it’s on you 24-7 and you just have an unfiltered thought that you speak aloud to yourself, well, suddenly you’re not the only one listening to your own thoughts.
0:35:13 It’s an AI that’s listening to your thoughts and it’s going, oh, they mentioned that.
0:35:16 Maybe this is a thing that I will generate a to-do list for.
0:35:19 And that sounds like a convenient application.
0:35:21 And there’s that word again, right?
0:35:21 Convenient.
0:35:25 You say something out loud and your smart glasses remember it for you.
0:35:28 But it’s not remembering it for you.
0:35:31 It’s also remembering it for the company and for the advertisers.
0:35:37 You blurt out something unconsciously about your head hurting or you’re around somebody else as they’re talking about having a headache.
0:35:51 And next thing you know, you’re getting a bunch of targeted ads for a very specific brand of headache medicine that has paid to be at the top of the list for the demographic of 25 to 35-year-old women who like cold brew coffee, live in urban environments, and like techno music.
0:35:58 If we want to get super dystopian about it, we live in the engagement economy, right?
0:36:05 The engagement economy requires a constant feed of data and personalization and all that sort of stuff.
0:36:08 What better AI training tool do you have than a wearable?
0:36:14 I sit up at night and I think about it and I was like, legitimately, we started out just tracking our steps.
0:36:16 And now it’s your heart rate.
0:36:21 They’re working on finding ways to tell you if your blood pressure is high or low, if your blood glucose is high or low.
0:36:28 So they’re looking to how can they feed a crap ton more data so that we can know more about you.
0:36:29 Give you more ads.
0:36:30 Give you more ads.
0:36:32 Personalized experiences.
0:36:38 Like, this is just my, I don’t know if you’ve seen the meme of Charlie from It’s Always Sunny with the red string on the board.
0:36:39 This is my conspiracy theory.
0:36:40 Yes, yes.
0:36:41 As to where wearables are going.
0:36:44 I don’t think that’s a conspiracy theory.
0:36:54 I think it’s a very reasonable thing to assume that the more data is collected about you, the more it can be used to show you ads that you will not scroll past.
0:36:55 Yeah.
0:37:04 You’ve got kind of a preview of this, of what society looks like, because you’re around the tech-type people all the time.
0:37:15 What does a society look like when we know that there’s a good chance that somebody around you is recording everything they’re seeing and hearing?
0:37:19 I think it’s a much more self-conscious society.
0:37:27 I have become someone who, when I’m out and about, I am scanning the glasses that people wear to see if there’s cameras on them.
0:37:29 I had a kind of a unique upbringing.
0:37:36 There’s a question of whether my dad was a North Korean spy or not and whether we were under surveillance at any given point in time.
0:37:39 And so, I grew up always thinking my life is kind of public.
0:37:43 I have to perform as if I’m always being watched.
0:37:45 So, I kind of grew up with that my whole life.
0:37:50 But it’s a heavy thing to grow up with.
0:38:02 And I think, you know, a lot of people are privileged and blessed to not grow up performing in that way as if there’s a movie set invisible camera on you at all times.
0:38:06 But I think that is just going to be a reality that everyone starts to do.
0:38:10 You start to become a lot more conscious of your actions in public.
0:38:15 You start to become conscious of the spaces in your house that are truly private.
0:38:22 You know, I say to people all the time that the only truly private place you have in this world is the inside of your head.
0:38:24 Like, that’s kind of dystopian when you say that.
0:38:37 But living the life that I do, testing the products that I do, having the upbringing that I had, I unfortunately think I am well-equipped to tell people that this is what’s coming.
0:38:45 I think we’re all going to have to live our lives as if we’re mini-celebrities out in public at any given point in time and that the paparazzo could come for you.
0:38:47 And there’s degrees of that.
0:38:52 Not all of us are Timothee Chalamet living out here, having to wear caps and disguises.
0:38:57 But to a degree, I do think we’re all going to be living very public lives.
0:38:59 Whether or not we want to.
0:39:04 Whether or not we want to, I think we are all in some way going to be living as public figures.
0:39:13 So I think as a society going forward, we really have to think about what are truly private spaces and what truly private spaces we want to protect.
0:39:17 Because it’s a human need to need that privacy.
0:39:23 And I don’t think that it should be given up for whatever convenience.
0:39:26 Like, however tempting the convenience is.
0:39:28 And sometimes the inconvenience is necessary.
0:39:42 It seems like this is another one of those conversations that we’re having as a society where the outcome seems predetermined, which is to say, these are eventually going to be adopted by everyone.
0:39:44 It’s just a matter of time.
0:39:50 This isn’t a decision that you as an individual get to make.
0:39:54 Look, you don’t have to like these glasses, but they’re going to hit mainstream adoption.
0:39:56 Everybody’s going to be wearing them.
0:39:59 And so you can choose not to buy them if you don’t want to.
0:40:03 But you cannot choose to live in a world that doesn’t have them.
0:40:07 And it feels like that is where we’re at right now.
0:40:08 Do you think we’re there?
0:40:10 I think you’re spot on because I think the sales prove it.
0:40:18 Google believes that the zeitgeist is strong enough for them to be like, hey, guys, smart glasses.
0:40:24 Let’s get back in on that and rebrand ourselves as the people with the most experience in the space.
0:40:28 So at the end of last year, they were like, put me back in, coach.
0:40:36 Like, at the end of last year, they legitimately launched Android XR, which is going to be their platform for smart glasses and headsets.
0:40:39 Why would you say now is the right moment to launch XR?
0:40:45 I think now is the perfect time to work on XR because you have a convergence of all these technologies.
0:40:49 We’ve been in this space since Google Glass and we have not stopped.
0:40:55 We can take those experiences, which already work great, and find new ways to be helpful for people.
0:41:04 Once Google is able to overcome their PTSD trauma over Google Glasses to be like, we want back in on the thing that people made the most fun of us for.
0:41:06 I think it is inevitable.
0:41:07 I think you’re spot on about that.
0:41:14 Thank you so much for listening to Killswitch.
0:41:15 I hope you dug this one.
0:41:19 If you want to connect with us, we’re on Instagram at KillswitchPod.
0:41:24 Or you can email us at Killswitch at Kaleidoscope.nyc.
0:41:32 And, you know, while you got your phone in your hand, whatever, before you put it back in your pocket, maybe wherever you’re listening to this podcast, leave us a review.
0:41:36 It helps other people find the show, which in turn helps us keep doing our thing.
0:41:39 Killswitch is hosted by me, Dexter Thomas.
0:41:43 It’s produced by Sheena Ozaki, Dara Lick-Potz, and Kate Osborne.
0:41:46 Our theme song is by me and Kyle Murdoch.
0:41:48 And Kyle also mixes the show.
0:41:54 From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne.
0:41:58 From iHeart, our executive producers are Katrina Norvell and Nikki Ettore.
0:42:00 Oh, one more thing.
0:42:04 So there’s that clip that we played of the Dragon Ball Scouter exploding.
0:42:06 And maybe you were wondering why it exploded.
0:42:11 Or if there’s any scientific basis for why a Scouter would explode when the power levels are too high.
0:42:14 Okay, maybe you weren’t wondering that, but I was.
0:42:28 And it turns out that the official Dragon Ball site published an article that kind of explains it featuring an interview with a professor in the engineering department at Miyazaki University who talks about using AI headsets to measure the weight of pigs.
0:42:31 I promise you, I promise you, I’m not making this up.
0:42:32 I’ll leave that in the show notes.
0:42:34 Anyway, catch you on the next one.
0:42:58 If a Lenovo gaming computer is on your holiday list, don’t shop around.
0:43:01 Just go directly to the source, Lenovo.com.
0:43:10 It’s your last chance to score exclusive deals on the gaming PCs you want, like the Lenovo Legion Tower 5 Gen 10 gaming desktop and Lenovo LOQ gaming laptop.
0:43:19 So avoid all that shopping chaos and price comparing, and just go directly to the source, Lenovo.com, where PCs are up to 35% off.
0:43:20 That’s Lenovo.com.
0:43:23 Lenovo. Lenovo.
0:43:29 On Masters of Scale, iconic leaders reveal how they’ve beaten the odds.
0:43:33 Asking really strong questions is a superpower.
0:43:38 You want to show up with something radically different and how they’ve grown companies to incredible heights.
0:43:41 The greatest rewards always come from the greatest risks.
0:43:42 That’s hit the gas.
0:43:46 Airbnb, Zillow, Microsoft, Liquid Death, and more.
0:43:49 Hear from the founders who’ve changed the game.
0:43:51 It’s anything but business as usual.
0:43:59 Find Masters of Scale on Apple Podcasts, Spotify, YouTube, or wherever else you get podcasts.
0:44:04 This is an iHeart Podcast.
0:44:06 Guaranteed Human.

Today, we’re sharing an episode of a show that explores the problems that new technology is creating and how we navigate living in the future. It’s called Kill Switch, and it’s hosted by Pulitzer Prize-winning journalist Dexter Thomas. The episode you’re about to hear is about the latest in wearable tech—stuff like smart glasses, pendants, watches, and rings. After Google Glass imploded back in 2013 amid backlash and ridicule, we’re now readily embracing wearables. What’s behind the new fervor for wearables today, and have we moved on from the privacy and surveillance questions that plagued Google Glass?

Dexter talks to Victoria Song, a senior reviewer at The Verge whose job it is to test out each new iteration of this technology, about the state of wearables today, why companies are obsessed with getting AI into them, and how they’ve already changed how we talk to each other, and ourselves, IRL. Find more episodes of Kill Switch wherever you get podcasts.

Got something you’re curious about? Hit them up at killswitch@kaleidoscope.nyc, or @killswitchpod, or @dexdigi on IG or Bluesky.

See omnystudio.com/listener for privacy information.
