AI transcript
0:00:15 Hello, and welcome to the NVIDIA AI Podcast. I’m your host, Noah Kravitz. A quick note
0:00:22 before we welcome today’s guest. The AI Podcast has a new home on the web at ai-podcast.nvidia.com.
0:00:27 You can find all of our episodes there, as well as links to listen to the show on your
0:00:31 choice of podcast platforms. If you like what you hear, please take a moment to follow,
0:00:35 subscribe, or even leave us a review. And if we’re missing your favorite platform on that
0:00:42 page, or you just want to tell us something, drop a line at ai-podcast.nvidia.com. Thanks
0:00:45 for listening, and let’s get right to it. My guest today is working at the leading edge
0:00:51 of marketing intelligence. He’s got a fascinating backstory, his company does, and today they’re
0:00:57 using data-backed strategy and AI to help brands transform their marketing. Tomás Puig, founder
0:01:03 and CEO of Alembic, is here to discuss it all, and I’ve got just enough of a background in
0:01:08 marketing myself that hopefully I can, you know, carry my end of the conversation. We’ll see.
0:01:12 Tomás, welcome, and thank you so much for joining the NVIDIA AI Podcast.
0:01:14 Thank you for having me. Pleasure to be here.
0:01:19 So, I kind of hinted at it, but it’s always better coming from the guest than me. We try to do these
0:01:24 intros, but interesting story behind Alembic, and I’m sure I only know the tip of the iceberg there.
0:01:29 So, can you tell us what Alembic is and the story behind founding the company?
0:01:34 Yeah, so we’ve been around a little while. We’re really an applied science company, and so it took us
0:01:40 many years to build technology. It began with three people originally, and yeah, still with us today.
0:01:46 My own background started at NASA Ames Research Center, originally when I was basically a kid.
0:01:51 Went into quantitative economics, and then decided, actually, I prefer music, the arts, and marketing.
0:01:54 I went that route for quite a while until I ended up coming back around.
0:01:54 Right.
0:01:59 The other founder was a guy named John Adams. John Adams, very storied infrastructure engineer in the
0:02:04 valley. He was the 13th employee at Twitter and took the company from the time it was a Mac mini
0:02:08 with a bad Ethernet cable under his desk all the way through the IPO.
0:02:11 I think for many years, he was the longest serving person not on the board of directors.
0:02:12 Wow. Okay.
0:02:19 And then Seth Little, who is a world-renowned creative director and designer who has rebuilt
0:02:24 brands for Lego and even done work for Apple, stuff like that over the years.
0:02:28 The three of us got together, and there were a number of reasons why we really chose this field.
0:02:34 But the most important is that we felt that marketing, and anything around the creative
0:02:40 and even the arts and promoting them, had lost something; no one had been able to be a storyteller
0:02:40 for years.
0:02:45 Everything had become so obsessed with trying to get that last little click, that last little
0:02:48 engagement, that last little performance, that we just about lost the plot.
0:02:56 And at the same time, I had a few deep beliefs about where things were going with technology that
0:03:01 married into that. And one of the things, and this kind of brings us to it, is that the
0:03:05 company really came out of a lot of the mathematics that were born during the pandemic.
0:03:06 Okay. How so?
0:03:10 A lot of people think mRNA is the only big tech to come out of the pandemic. The other thing
0:03:14 to come out was a lot of incredible math. It was really one of the first times that we
0:03:20 used huge-scale compute and modeling to actually be able to analyze something causally in real
0:03:25 time as an emergency was happening. We’d done weather and stuff before, right? But nothing
0:03:30 like that. And so we’re like, well, nuclear, weather, disease, drug discovery, everything
0:03:35 else is using these type of super compute modeling and deep learning. Why is everybody who does
0:03:38 creative, the arts, marketing, and everything else stuck with math in the 1970s?
0:03:44 So we're also a bit unique as a company in that we actually run our own private
0:03:47 cluster in our own cage. Like we physically own our hardware.
0:03:51 Oh, wow. Okay. You guys, I should have said at the beginning, you’re San Francisco-based?
0:03:53 San Francisco-based. Hardware’s North Virginia.
0:03:53 Okay.
0:03:57 And I'm assuming this is a very nerdy audience. And so I'm going to be like, we're L2-patched straight
0:03:59 into AWS us-east-1 out of Equinix.
0:04:00 Yeah, go, go.
0:04:04 And so when we founded the company, we sat there and we're like, well, what would
0:04:08 it take? Well, the problem is that what it would take to actually do this was an entire
0:04:14 deep offshoot of a branch of mathematics. So one of the things we deeply believe, and we
0:04:19 talk about a lot, is that the profit at companies follows the flow of information.
0:04:22 Okay. So if you want to follow the money, everybody says follow the money, but really follow the
0:04:23 information.
0:04:25 Follow the information, you will find the money.
0:04:26 You’ll find the money.
0:04:26 Right.
0:04:32 Right. And what’s so important about that is that I believe all the alpha, all the profit
0:04:38 that will exist in corporations in the next while will all come from private data sets.
0:04:44 We are seeing it with the major models and LLMs. In fact, there was a paper just released where
0:04:48 they showed that models, when trained on the same or similar public data sets, end up more than
0:04:51 90% the same, by the way.
0:04:52 Okay.
0:04:57 We are seeing a convergence there. And so, and Jensen spoke to this actually more recently
0:04:59 too, where we will buy it like electricity.
0:05:00 Right.
0:05:00 Right.
0:05:05 But that means that it will also be converged and commoditized. The way I put it though, is
0:05:10 that these models are converging, right? And so as they converge, they will be the difference
0:05:12 between buying, say, BP and Shell gasoline.
0:05:13 Sure.
0:05:17 Right. No one has the same private data. Now, private data could be a songwriter writing his
0:05:22 own song. That is a private piece of work that he wrote, that is his, you know, what it
0:05:26 is. It could also be a giant corporation that has a huge corpus of data that nothing else
0:05:31 can see. And so I believe that within the next period of time, the thing that will generate
0:05:37 the most data will actually be lived human experiences. They'll generate the most brand-new, net-new data.
0:05:38 Right.
0:05:38 Right.
0:05:41 I’m not saying that there’s not a ton of data that’s like derivative or whatever.
0:05:41 Sure.
0:05:47 Net new data. And so this could be, let’s take Disney as an example, fake example. Yeah. ESPN,
0:05:52 Disney plus, Hulu, the parks, magic bands, all the toys, everything else.
0:05:52 Sure.
0:05:53 Marvel.
0:05:53 A lot.
0:05:54 Star Wars.
0:05:55 A lot, a lot.
0:06:01 Monstrous, right? Like you've got a century of scripted IP. So all of this means that
0:06:06 being able to take that, feed it in, and learn from it, you know, understand how it's
0:06:09 structurally connected and then act upon it with agents and models of what the world’s
0:06:10 going to be like.
0:06:10 Right.
0:06:11 And that’s where the profits can come from.
0:06:16 Right. And so everybody, the more you can act on that with your data, the more it’s going
0:06:21 to inform what you can and perhaps should do and how you do it and everything.
0:06:24 Yeah. And so one of the first companies to really do this in the
0:06:28 early days, and I'm a huge fan of them, was Renaissance Technologies. They were the first high-frequency
0:06:33 trading firm out of New York, founded by the mathematician Jim Simons. Right. You
0:06:38 saw this in the quant space first, then you saw Palantir try to do it; they were really
0:06:42 trying to find a needle in a haystack, right? Like one actor out of a group. Right. Like
0:06:46 what really happens is you have this massive data. And so, you know, there’s a few reasons
0:06:51 I kind of see this already. You know, first, look at open source models, DeepSeek, et cetera.
0:06:56 They’re getting close to matching the premium APIs in terms of performance and everything
0:07:02 else. Right. Second, you are seeing larger firms be a much
0:07:05 bigger part of this than they were in the early days of the mobile revolution. Yes, you have a couple of
0:07:08 small companies like Cursor and everything that have a small number of employees and are pushing
0:07:12 it through, or the big LLM shops, but there are serious amounts of capital behind them
0:07:17 from the large backers. Yeah. Private data has much higher prominence in this cycle, and these firms
0:07:22 have huge corpuses to apply this to. And the third is that anything that generates
0:07:28 derivative data, say you do something and there’s an action taken. It’s like compounding interest.
0:07:29 Right. It’s a flywheel.
0:07:34 Which is a flywheel, just like you're saying. And so it feeds upon itself. And so if we think
0:07:40 about the 2010s, right? In the 2010s, the advantage went to the teams that could capture and operationalize
0:07:45 new data or audience fire hoses. You had the rise of Facebook, the rise of things like that.
0:07:52 The 2020s, what we're seeing right now, is who can stack the most GPUs. Hardware matters so much,
0:07:56 right? And you literally see teams being separated into haves and have-nots by the amount
0:08:01 of horsepower they have to apply to these things. Some of that’s being offset by innovation, right?
0:08:06 DeepSeek, et cetera. But still, to a large extent, to serve that, you've got to have the power and chips.
0:08:14 So I think in the 2030s, it's going to be who turns their private data and its exhaust into a deep
0:08:20 learning asset that then gets applied against the business. How do you work with your customers? How do you help brands,
0:08:25 you know, start to tap into this data wherever they’re at in that process and do things with it?
0:08:29 Gain insights is what we always talk about, but you know, how do you help them?
0:08:33 So think about it like this. I’d say that two things to know about Alembic. One,
0:08:38 every single piece of data that ever comes into our system is anonymous aggregate data. We allow
0:08:44 no PII ever. There’s literally never been an incident of it in our system. Just like when you
0:08:48 think about how we're really based off the biomedical math, right? You don't have PII there
0:08:52 either for, like, a phase-three trial, right? You're not tracking. Great. The second thing to know
0:08:58 is that we ingest enormous amounts of it. Yeah. We’ve brought in a hundred billion rows of data in
0:09:02 three days for clients. And this can be like every transaction from 17,000 stores,
0:09:07 stuff like that. The problem our clients have and anybody has is, and this is one thing that I find
0:09:14 very interesting about why you haven't seen these query-your-database tools take off as
0:09:18 much as you've seen the generative tools take off. Because the problem actually is, you ever read
0:09:22 Hitchhiker’s Guide to the Galaxy? Long time ago. Yeah. So you get the very end of the book, right?
0:09:25 They're like, what's the meaning of life, the universe, and everything? Yeah. And they go, well,
0:09:30 the problem is we don’t know the question. Right. When you have that much data, you don’t even know
0:09:35 the question to ask. Yeah. To be able to analyze it. Or worse yet, you think you know the question
0:09:39 and you’re so far off. Then I’m down a rabbit hole. Oh, and then, and then you make a wrong
0:09:44 assumption, right? And it becomes a huge pain. And so what we have to do first is we have to ingest an
0:09:47 enormous amount of data and we have to signal process it. We actually use, we announced this
0:09:52 at GTC actually. The way we do signal processing and probably one of the coolest pieces of tech we have
0:09:57 is we actually use spiking neural networks to do it. That gives us a lot of superpowers. And the two
0:10:02 problems we had to solve for it were this. That's why, why go build it? You know, spiking neural networks
0:10:06 typically have been run on neuromorphic hardware, which usually people think of as wetware, like
0:10:14 half-biological computers. Right. We wrote a simulator for the wetware as a kernel on NVIDIA GPUs, just like you
0:10:19 wrote a simulator for quantum. Yep. We wrote the simulator for neuromorphic on the hardware. And
0:10:24 that's how we run an SNN. And so it's actually fully custom to the NVIDIA chip. Obviously the answer
0:10:28 must be good, but how, how well does it perform? Really well, because the two problems we had to
0:10:34 solve were these. One, how do you compare apples and oranges? Right. Yeah. How is a Nielsen rating or
0:10:42 TV ad or a Spotify play the same as a phone visit or walking into a store, right? Different modalities,
0:10:47 different mediums, right? They all count as marketing touch points,
0:10:52 but yeah. Marketing has the worst job because their job is everything. Right. Right. Second
0:10:56 reason it's the worst job is everybody on earth thinks they're a marketer. The second thing we had to
0:11:00 solve was this: in marketing, you always hear, oh yeah, we'll do a campaign for a couple of weeks.
0:11:07 So you have no time history, but how do you look for outliers with no time history?
0:11:12 So those two problems had to be solved, and SNNs were the solution. We turn everything into
0:11:17 spike signals, spike encodings. That's how it all becomes apples to apples. For people who don't know, a one-sentence
0:11:22 definition of what an SNN is. A spiking neural network is meant to be a digital twin of the human brain
0:11:27 to where it can actually process signals like neuron spikes, just like your brain would.
0:11:32 And what it's doing is seeing signal and then firing neurons off a propensity.
0:11:36 There’s a lot more explanation to that. The reason that they’re really well-loved
0:11:41 is they're incredibly fast and they are online, as we call them. So that means they are always on
0:11:46 and they're evolving. So you don't actually train them in advance. They're an evolving network.
0:11:52 So originally they were really well-loved in IoT, where people thought about them, because they're
0:12:01 fast. We use them to find a way to detect outliers. A human, a baby, can pattern match from
0:12:07 the moment it's born, almost, without being taught. So when you're looking for outliers in data, normally what
0:12:11 we would do is use prediction: grab all the previous data, see where the highest highs and
0:12:16 lowest lows would be predicted to be, and anything outside of that is an outlier. There's other ways to do it too,
0:12:23 but for this audience, we'll keep it brief. What we do is we're actually looking at patterns
0:12:30 and seeing those changes from the neuromorphic, neural perspective in SNNs. And so
0:12:33 we can do outlier detections with no time history.
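To make the spike-encoding idea concrete, here is a toy sketch, not Alembic's actual system: a simple delta-modulation spike encoder and a burst-based outlier flag that needs no long training history. The thresholds, window sizes, and function names are all invented for illustration.

```python
# Toy sketch: delta-threshold spike encoding of a short time series,
# flagging outliers without a long training history. Illustrative only;
# thresholds and window sizes here are arbitrary assumptions.

def spike_encode(series, threshold=1.0):
    """Emit a +1/-1 spike whenever the signal moves more than
    `threshold` since the last emitted spike (delta modulation)."""
    spikes = []
    last = series[0]
    for i, x in enumerate(series[1:], start=1):
        delta = x - last
        if abs(delta) >= threshold:
            spikes.append((i, 1 if delta > 0 else -1))
            last = x
    return spikes

def outlier_steps(series, threshold=1.0, burst=3):
    """Call a step an outlier when spikes arrive in a dense burst:
    `burst` or more spikes within a window of `burst` time steps."""
    spikes = spike_encode(series, threshold)
    times = [t for t, _ in spikes]
    outliers = set()
    for i in range(len(times)):
        window = [t for t in times if times[i] <= t < times[i] + burst]
        if len(window) >= burst:
            outliers.update(window)
    return sorted(outliers)

flat = [10.0] * 10
surge = [10, 10, 10, 14, 19, 25, 10, 10]  # sudden burst at steps 3-6
print(outlier_steps(flat))                  # -> []
print(outlier_steps(surge, threshold=2.0))  # -> [3, 4, 5, 6]
```

The point of the sketch is that the detector reacts to the shape of the signal (a burst of spikes) rather than to a prediction fit on months of history, which is the property the speaker attributes to SNN-based detection.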
0:12:40 The reason this is important is that we presented at GTC a case study with the CMO of Delta,
0:12:44 and they sponsored the Olympics. And there’s a great recording of this. Actually, we did a session
0:12:49 and the problem is the Olympics happens for two weeks, every two to four years.
0:12:57 If you do daily data, that’s 14 time steps. What do you do with that? So doing that signal processing,
0:13:02 first, we had to do that. The second thing is we actually had to figure out how to do the
0:13:07 connections, connect the chain of events. So say you watch the Olympics, then you Google for a Delta
0:13:11 flight, you click on a Google flights ad, and then you buy it. How do you connect those things
0:13:17 in time? So that’s where we ended up building all of our causal mathematics. So we use causal inference
0:13:21 and transfer information mathematics to be able to do that. So the first thing is we have to see the
0:13:27 signal at all. We build each time series with its own mini neural network. That neural network then
0:13:30 gets chained together with causal links, and then we can build chain reactions.
0:13:33 Then you can mine the chain reactions for intelligence.
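The chaining of time series with causal links can be pictured with a toy stand-in. Alembic describes causal inference and transfer-entropy mathematics; transfer entropy proper is far more involved, so this sketch scores a directed link A to B by the lagged correlation of A's past with B's present. All series, names, and cutoffs are invented assumptions.

```python
# Toy stand-in for causal-link discovery between time series:
# a directed edge src -> dst exists when some lag of src correlates
# strongly with dst. Real transfer entropy is more sophisticated.
import statistics

def lagged_corr(a, b, lag):
    """Pearson correlation of a[t-lag] against b[t]."""
    x, y = a[:-lag], b[lag:]
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = (sum((xi - mx) ** 2 for xi in x)
           * sum((yi - my) ** 2 for yi in y)) ** 0.5
    return num / den if den else 0.0

def causal_links(series, max_lag=3, cutoff=0.9):
    """Return directed edges (src, dst, lag) where some lag of src
    correlates strongly with dst."""
    edges = []
    for src, a in series.items():
        for dst, b in series.items():
            if src == dst:
                continue
            best = max(range(1, max_lag + 1),
                       key=lambda lag: abs(lagged_corr(a, b, lag)))
            if abs(lagged_corr(a, b, best)) >= cutoff:
                edges.append((src, dst, best))
    return edges

# Hypothetical Olympics-style chain: ad airs, searches follow one step
# later, bookings two steps later.
ads      = [0, 5, 0, 0, 6, 0, 0, 0]
searches = [0, 0, 5, 0, 0, 6, 0, 0]   # ads shifted by 1
bookings = [0, 0, 0, 5, 0, 0, 6, 0]   # ads shifted by 2
links = causal_links({"ads": ads, "searches": searches,
                      "bookings": bookings})
print(links)
```

Running this recovers the chain ads to searches (lag 1), ads to bookings (lag 2), and searches to bookings (lag 1), which is the kind of edge list you would then mine for intelligence.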
0:13:39 Okay. So let's stay with that example then of Delta and the Olympics. A minute ago, you were talking
0:13:44 about being able to ingest a client's data. And so I'm wondering, can you map
0:13:48 these technologies you're talking about, the spiking neural networks, and you started talking about
0:13:53 causal AI, which is the other thing I wanted to ask you about, I needed to hit it. What can you do
0:13:59 with the data? How do these chain reactions apply? What does the
0:14:02 client see? I don't want to say at the end, but, you know.
0:14:06 You know, it’s funny. At the end, I actually believe that all dashboards should never exist.
0:14:12 Dashboards only exist to derive intelligence from them, right? Nobody wakes up in the morning and goes,
0:14:14 you know what I want in life? Another dashboard.
0:14:14 Yeah.
0:14:15 Yeah.
0:14:18 So what you do is you look at the dashboard and then you write an email. So the way our
0:14:23 intelligence comes out is, actually, the model for it was
0:14:25 how the President of the U.S. gets an intelligence briefing every day.
0:14:30 You provide them as literal intelligence. Great. Yep. Oddly enough, we use LLMs at the company,
0:14:35 but not in the same way other people do. So there's a game called Mad Libs that I play with my kids.
0:14:40 It’s like you have a paragraph and then you have a blank and you fill in the blank and make a funny
0:14:45 sentence. That’s how we use LLMs. They never make a decision about data ever, but we do love them for
0:14:48 user experience. So they'll write, like, the body of the report, but then the causal chains will
0:14:54 actually just put in the actual data. And that means that, um, it kind of inoculates them
0:14:58 against the hallucination issue, because everything that's an actual, real, important data point, that's
0:15:04 not like an "is" or an "and" or a connector, is handled by a deeper method, right? A deep learning method.
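A toy sketch of this Mad Libs pattern: the narrative shell has named blanks, and only deterministic code fills them from computed results, so the language model never invents a number. The template text, field names, and figures here are all made up for illustration.

```python
# Toy sketch of the "Mad Libs" pattern: the LLM would only produce a
# narrative template with named blanks; every real data point is
# substituted in by deterministic code. All fields below are invented.

def render_briefing(template: str, facts: dict) -> str:
    """Fill the blanks. Raises KeyError if the template asks for a fact
    we did not compute -- a number can never be invented."""
    return template.format(**facts)

# Imagine an LLM wrote this template (blanks only, no numbers):
template = ("The {event} sponsorship drove a {lift_pct}% lift in "
            "{metric}, peaking {lag_days} days after the broadcast.")

# ...and the causal pipeline, not the LLM, supplies the numbers:
facts = {"event": "Olympics", "lift_pct": 12.4,
         "metric": "flight searches", "lag_days": 3}

print(render_briefing(template, facts))
```

The design choice this illustrates: prose quality comes from the generative model, while factual content is constrained to values the pipeline actually computed, which is what "inoculates" the briefing against hallucinated data points.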
0:15:08 And then we use the rest for user experience and communication. Oh, interesting. Those are derived
0:15:12 from the chains, but let’s take Delta as an example and talk about the Olympics. Sponsoring the Olympics
0:15:18 on a national level is expensive. We're talking eight figures at least. I think, uh, NBC reported
0:15:23 hundreds of millions, like maybe even a billion dollars in revenue off of the Olympics alone. I forget
0:15:29 what it was. I’d have to go look. It was massive. And when you’re doing that, there’s two types of
0:15:33 things that occur. One is you buy a whole bunch of 30 and 60 second ad spots, right? That you do.
0:15:38 The other is, is that you see this in sports all the time. You have things that are named after
0:15:43 companies. So if you watched the Olympics, you saw that the medal presentation ceremonies were Delta's.
0:15:49 You know what sells a whole lot of tickets to Paris? Watching the Eiffel Tower in the background in a
0:15:54 really emotional moment with an athlete as you put a gold medal on them. Yeah. That was actually more
0:15:59 effective than the ads themselves in some circumstances. Yeah, no, I, I can see it.
0:16:04 How do you connect that? Right. Somebody’s booking a flight. They’re doing some planning. There’s a delay
0:16:09 between those things. It's not like you book it while you're watching the TV. So you have to calculate
0:16:14 what we call the optimal lag, right? Between the time series, right? How do you calculate those
0:16:19 things? Right. And so, now, we think about this like common sense. Of course I
0:16:23 saw that, and then someone's going to buy something. It's much harder to mathematically prove it.
0:16:28 You can’t even imagine. You know, we work with a finance organization where we actually look at
0:16:34 what causes ETF, exchange-traded fund, flows. Conversions can be anything. You know what I mean?
0:16:38 You could be selling something or you could be trying to change the volume of something,
0:16:42 or you could actually care most about how many open source contributors you get.
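The "optimal lag" idea mentioned a moment ago, the delay between seeing the broadcast and booking the flight, can be sketched minimally: slide one series against the other and keep the shift with the strongest alignment. The data and the plain dot-product score are illustrative assumptions, not Alembic's method.

```python
# Minimal sketch of estimating the optimal lag between two daily time
# series, e.g. TV exposure and bookings: try each shift and keep the
# one where the series line up best. Data and scoring are illustrative.

def optimal_lag(cause, effect, max_lag=7):
    """Return the lag (in steps) at which `cause`, shifted forward,
    best lines up with `effect`, scored by a plain dot product."""
    def score(lag):
        return sum(c * e for c, e in zip(cause, effect[lag:]))
    return max(range(0, max_lag + 1), key=score)

tv_spots = [9, 0, 0, 1, 0, 0, 0, 0, 0, 0]
bookings = [0, 0, 0, 9, 0, 0, 1, 0, 0, 0]  # same shape, 3 days later
print(optimal_lag(tv_spots, bookings))  # -> 3
```

Once each pair of series has an estimated lag like this, the lagged connections can be assembled into the chain reactions discussed earlier.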
0:16:46 What we do is when we look at these chain reactions, we have nodes and edges as we call
0:16:49 them, right? Nodes are the points, like in connect-the-dots. The edge is the connection
0:16:55 between them. We calculate every connection that could possibly exist as we build stuff.
0:16:55 Right.
0:17:01 So then we can dynamically search the intelligence afterwards. And so we can literally morph our
0:17:04 reports based off what the user wants to see. So if you’re like, I want to know all the
0:17:09 things that are about selling the ticket. And the next day it's, I want to know all
0:17:13 the things that are about gaining frequent flyers. All you do is change the focus. It redoes the search
0:17:18 in real time instantaneously. And so that type of stuff, what I find really interesting is a lot of
0:17:23 people are working in deep learning right now. The reason why we're so sticky with these huge,
0:17:29 big enterprise customers, and we're not a cheap system, is that they derive real value out of it.
0:17:32 And also we understand that we have to meet them where they are.
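The focus-driven search over nodes and edges described a moment ago can be pictured with a toy graph: the causal links are computed once, and changing the focus just re-filters the chains in memory, which is why the report can morph instantly. The graph contents and the substring-matching "focus" scheme are invented for illustration.

```python
# Toy sketch of focus-driven search over precomputed causal chains:
# nodes and edges exist already; a new "focus" only re-filters them,
# no recomputation. Graph contents here are invented.

edges = [  # (source node, target node, strength)
    ("olympics_ad", "flight_search", 0.9),
    ("flight_search", "ticket_sale", 0.8),
    ("olympics_ad", "loyalty_signup", 0.6),
    ("loyalty_signup", "frequent_flyer", 0.7),
]

def chains_for(focus, edges):
    """Return every edge path that ends at a node matching the focus."""
    def extend(node, path):
        incoming = [e for e in edges if e[1] == node]
        if not incoming:
            return [path]
        return [p for e in incoming for p in extend(e[0], [e] + path)]
    ends = {e[1] for e in edges if focus in e[1]}
    return [p for end in ends for p in extend(end, [])]

# Same graph, two different questions -- only the focus changes:
for chain in chains_for("ticket", edges):
    print(" -> ".join([chain[0][0]] + [e[1] for e in chain]))
for chain in chains_for("frequent_flyer", edges):
    print(" -> ".join([chain[0][0]] + [e[1] for e in chain]))
```

Asking about tickets surfaces the ad-to-search-to-sale chain; asking about frequent flyers surfaces the loyalty chain, with no change to the underlying graph.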
0:17:34 How do you mean meet them where they are?
0:17:38 Well, I find that a lot of people are talking about AI and form factor. Should it be hardware?
0:17:43 Should it be software? How should it speak? Everything else. But I never hear anybody ask,
0:17:49 is the thing it outputs actually useful? You know, I believe that actually we're having a paradigm
0:17:54 shift in the entire thing. For the longest time, there were a ton of studies that said 90% of the world
0:18:01 was consumers. 9% was curators and 1% was creators. And this actually held true for pretty much like as
0:18:06 long as people could remember. But now suddenly with AI, the curators are actually creators.
0:18:07 Right.
0:18:14 So the 1% is now 10%. So you've 10x'd the footprint of people who want to build things.
0:18:18 And the monstrous strategic shift that occurs with that is large.
0:18:23 The way you put that, it reminds me of, I mean, it reminds me of the 2000s and the web first,
0:18:27 you know, coming out and anybody could publish if you wanted to.
0:18:32 You know what reminds me of the most is actually when you could start recording your own record in
0:18:33 your living room.
0:18:34 Right.
0:18:38 It used to be, you had to have these ginormous studios. We see them in films, right? And everything
0:18:42 else. And then you have people like Beck and everything being like, I wrote a hit record on my
0:18:48 four-track. Yeah. And the democratization of that, I don't think anyone would say music is worse off
0:18:52 because of it. The music industry, the monetization may have suffered, but the actual quality of the
0:18:57 music and the stuff coming out, I wouldn't say it's suffered. I'd say we have access to more
0:18:59 independent music than we have ever had.
0:19:03 Oh, for sure. No, the difference between how my kids get music and how I did, you know,
0:19:05 couldn’t be, couldn’t be more different.
0:19:10 So, kind of like what we talk about, suddenly, let's take code as the example. We use a
0:19:16 lot of AI to write code now, and so does everybody else, but code used to be, or is in a lot of
0:19:21 places, a bespoke thing, right? Like it would be the most beautiful furniture ever, and somebody's
0:19:28 hand-crafting this beautiful object. Or before the assembly line existed for Ford, right? Any motor
0:19:33 vehicle would be handcrafted in every single little individual detail. But now it's no longer
0:19:39 bespoke. Now you have mass-crafted, mass-generated objects.
0:19:44 In the beginning, I'm sure mass-generated cars did not equal the bespoke cars. Of course.
0:19:48 Nowadays, I would say most of the mass-generated cars are probably better than the
0:19:52 bespoke cars from a safety perspective. Code is doing the same thing. And so now you have two
0:19:58 fronts. You have everything able to move across this information. And the second thing is
0:20:04 the curatorial class, right? Those with taste are now creators by default, right?
0:20:11 The playlist has become a Veblen good, right? Like, um, now, I love tools that give people agency to
0:20:19 control themselves. I am deeply worried long-term about access because that can be hard, but I think
0:20:23 about the number of musicians who never would have had anyone to hear them at all. If a four track
0:20:28 didn’t exist. Yeah. I think about the number of writers or filmmakers or brilliant people who are never
0:20:33 going to be heard at all, if they did not have the democratization of tools. When we bring this back
0:20:38 around to marketing: attention is really what there is, and it's
0:20:43 supply-side limited. There's only so much attention in the world, only so much time, and every
0:20:47 company in the entire world is competing for the same slice of attention at the end of the day.
0:20:52 The only commodity that can never be increased is a person's heartbeats. You only get so many until the day you die,
0:20:56 and you can't get them back. So when you're doing that marketing and everything else, and we're trying to do
0:21:00 that, we're trying to understand the universe and get down to it. Now, we've gotten a little
0:21:04 theoretical here, but I think it's an important paradigm shift that we're discussing because
0:21:10 you can’t separate at the end of the day marketing from the rest of the businesses it governs.
0:21:17 So, so bring it back down sort of to, you know, the business nuts-and-bolts level. How is AI, I mean,
0:21:23 as much as you can, and, you know, be specific, because there are so many ways, but what are some of the biggest
0:21:29 ways you see, maybe that's a better way to ask it, that AI is already changing how brands relate to their audiences,
0:21:36 and, you know, what's happening kind of just down the road? I think this idea of, and it's funny, as I was
0:21:42 prepping to talk to you, I was thinking about, we’ve done a lot of episodes on health and medicine lately, and this
0:21:50 idea of precision medicine, right? And being able to offer, you know, each individual, the individualized care,
0:21:57 preventative care, you know, it’s technology enabling us to provide that level of personalization.
0:22:04 And so there’s a similar thing in some ways, as I understand it with marketing, and I mean, to really
0:22:08 probably not do it justice, but I break it down in my head to cookies following me around.
0:22:14 And that being kind of a, you know, to me seems like a brute force, maybe outdated at this point
0:22:19 way of doing it, but what’s AI doing for all this? And what’s it going to be doing for, for brands and
0:22:21 engagement and personalization?
0:22:28 I'll take a little bit of an anti stance here. Human beings love building tribes and community and the
0:22:34 most visceral experiences that they have. Let’s take sports. People will often say sports is the new
0:22:39 church. It’s not my phrase. It’s just what everybody, you know, you want that community, whether that is
0:22:45 in that format or it’s in the sports format or something else, that means that you need everybody
0:22:52 to experience the same thing and be able to respond to it. And if you think about the fondest experiences
0:22:58 you can remember, almost, almost always are there with somebody else, right? Remember doing thing with
0:23:03 somebody or how it affected somebody or how it did something. And so while I believe that personalization
0:23:11 is interesting, I actually believe that being able to really build these experiences that people have
0:23:16 and to be able to bring people closer together, those are the marketers that are going to win.
0:23:17 Okay.
0:23:22 Mathematically, I talk about this a lot where you’re like, what does an LLM mostly do? It predicts the
0:23:30 next best token. Inherently, that next best token is the one you would expect. That's what it's
0:23:35 supposed to do. That's right. Kind of in a weird way, it's like it's
0:23:39 going to give you the median, right? It’s going to give you the answer you should have. But what
0:23:43 great artist, what great marketer wakes up in the morning and goes, you know what I want to be? The
0:23:49 median. And, I don't know, it's gotten better, right? But I can still feel the
0:23:55 aesthetic of AI imagery and everything that comes out because it is still trying to, it is almost going
0:24:01 to have its own aesthetic. And so I think that what will happen is, is that the uniqueness of
0:24:07 experiences and everything else and the premium of that will rise. And that we will have this kind
0:24:11 of push against where it’s not like all about personalization forever. Everybody’s going to
0:24:14 realize you still are going to want to experience things with other people.
0:24:15 Interesting. Yeah.
0:24:19 And so for maybe 10% of your life, you’re going to want to hyper-personalize,
0:24:22 like I want to talk one-on-one with it, right? Or maybe it’s an ephemeral thing. You’re
0:24:29 using, I heard a great founder talk recently about how he uses ChatGPT and he writes a story
0:24:34 with his kid at bedtime and he just doesn’t keep it. It’s audio. They’re just playing with it. That’s
0:24:39 ephemeral. But you don’t go to the water cooler the next day or get online and talk on Discord with
0:24:45 your buddy about that. And so I feel like often human nature is lost within this. So marketers,
0:24:50 our job is to create those experiences. And then when it’s a touch point for preference that you’ve
0:24:56 already created, then try to enhance it with some personalization. But I feel like people have lost
0:24:57 the plot a bit.
0:25:02 Yeah. Interesting. I don’t disagree. I just, it’s interesting to think about that in the context of,
0:25:08 as I said, all the personalization that jumps to mind when thinking about these types of things
0:25:10 or the potential personalization.
0:25:13 Well, I always think of personalization as, once you've engaged with the brand,
0:25:16 how does the brand find the optimal path for you?
0:25:16 Yeah.
0:25:17 Deeply important.
0:25:18 Yeah.
0:25:24 If I’m already an Amazon customer, I need the happy path. If I am an Apple customer,
0:25:30 what iPhone do I buy? If I’m, I have some friends who love Alice and Olivia dresses. They’ve got a ton
0:25:33 of different dresses. Which one should they get? Right? They’re going to have preferences within that.
0:25:36 But I’m talking about like, how do you even get them to like the designer?
0:25:39 Right. You have that shared moment. Yeah.
0:25:46 Profit and the margin comes from brand, but what is the value of brand? How do you actually price that?
0:25:51 All right. So I’m going to flip this on you, and it’s not even flipping. I was thinking of
0:25:56 an excuse to play off of what you just said. How do we value the human in all this going forward?
0:26:03 To go to the other side of the process and think about the creator, the ad man and woman,
0:26:08 if you will, whoever’s on, you know, the creating, or is it curating, side. What’s the role for
0:26:16 human creativity? How does that evolve for people in marketing, people in media, whatever part
0:26:20 of the process, you know, you’re working on, you’re overseeing, you’re collaborating on, like,
0:26:25 how does that role change? And years ago, and it’s crazy that I can say years ago, referring to
0:26:30 this podcast, but we had somebody on, I don’t want to misrepresent his title, but a
0:26:35 creative director basically working in the video games industry, talking about how generative
0:26:40 AI was enabling his work. I think the title of the episode was something about making zombie armies
0:26:45 with Gen AI, something to that effect. And it’s this metaphor
0:26:51 that we keep using. Think of your co-pilot as like an intern, right? A highly capable intern who needs
0:26:56 a lot of direction, instruction, learning the ropes, but they can generate good work if, you know,
0:27:01 correctly prompted, if you will. But he was talking about the ability for one person to kind of give
0:27:08 creative director level instructions and have the AI, you know, fill in the rest of the zombie army
0:27:14 from a couple of examples or generate a landscape or, you know, that kind of a thing. And so when you
0:27:15 said, you know, curators.
0:27:16 No, there’s a guy named Andy Warhol.
0:27:18 Right? I’ve heard of him, yeah.
0:27:21 Right? Andy Warhol did not paint all his pieces. How’s it different?
0:27:25 Well, I’m not sure how to draw the metaphor, because I’m sure Andy Warhol had lots of people
0:27:31 working for him as well. But what happens to the human if I can generate, you know,
0:27:37 the work of five people in a creative setting? Instead of being a
0:27:46 writer who learned how to manage creative teams, am I just more of a, you know, creative team manager
0:27:48 from the get-go, with some of my team being AI?
0:27:53 I think sometimes we get so theoretical about this stuff. You know, I had a terrible punk band
0:27:55 as a teenager. We were awful.
0:27:57 Nice. What were you called?
0:28:01 “And Then There Were Three.” We had four guys, and then we ended up with three of them. And so
0:28:02 we just called the band “And Then There Were Three.”
0:28:03 You’re a Genesis fan?
0:28:04 Oh, yeah, yeah.
0:28:05 Okay.
0:28:06 Yeah, it’s the whole joke.
0:28:09 That’s like, there’s no way, there’s no way you’re not.
0:28:14 I know. So the funny thing about all this is, I think back to then, right? If I try to put
0:28:19 myself in my young shoes, right? 15. All I want to do is make cool stuff.
0:28:20 Totally.
0:28:24 Ezra Klein gives a great talk about this, where he says, for those that become creatives,
0:28:29 like, all you have in the early days is taste. And the hardest part about taste is when you do
0:28:36 something, you suck. But you know you suck in the early days. And so you have to fight through all of
0:28:45 the hundreds and thousands of awful attempts to then generate a style and a body of work. And I think
0:28:53 about how incredibly demotivating and awful that was at the time. And what it will do is, for the people
0:28:59 starting out, it will create great agency. And they will be able to do incredibly cool things. And they
0:29:04 will get 80% of the way there. But I also think it’s going to create this weird, almost even harder
0:29:07 field to get through, where you’re going to be stuck there way longer.
0:29:13 At that sort of glass ceiling of understanding that there’s better work out there. I just don’t know
0:29:13 how to create it.
0:29:16 Yeah. Because I mean, you’re not failing as much.
0:29:17 Right.
0:29:23 Right. And so for me, the interesting conversation about this the most is what happens to us as humans
0:29:26 when we don’t fail enough to improve?
0:29:27 Right.
0:29:34 Because failure is what makes us improve. And man, I do things the hard way so many times.
0:29:34 Yeah.
0:29:39 Right. Even today, like, I’ll do things where I’m like, did I really do that? But I have a son who’s
0:29:44 seven. And I’ve watched him, like, struggle and get through and learn things and build them. We’re
0:29:46 building Gundam together now. They’re Japanese robots.
0:29:46 Oh, nice. Yeah.
0:29:50 First one he built, he didn’t get it. He accidentally chopped off a part, right? Now he can build one by
0:29:51 himself. He’s seven.
0:29:52 Amazing.
0:29:55 And so what happens when failure ceases to exist?
0:30:02 Does the ability to so quickly generate so many more takes, do you think that desensitizes us to
0:30:02 failing?
0:30:05 I think it means that you’re not going to learn from the failures as much. You’ll learn different
0:30:08 things, right? You’ll learn how to manipulate the AI. We’ll become experts in AI manipulation.
0:30:13 It’s just an interesting thing. Do those small failures where they’re not really failures count?
0:30:17 Will it have the same effect? I actually don’t know the answer to this. I just think it’s
0:30:21 the most interesting component of what you mentioned from that basis.
0:30:26 Right. It’s showing my age, but my mind for some reason jumped to
0:30:34 digital photography. And, you know, that move from every photo I take literally costing me,
0:30:40 you know, a roll of film divided by X exposures, and, worse yet than the monetary cost, once
0:30:45 my roll’s out, you know, I’ve got the rest of my day in Paris and I can’t take pictures, to, you know,
0:30:50 a phone in my pocket. It’s all digital. Take a whole bunch, pick the best one, you know,
0:30:51 delete the others.
0:30:56 Yeah. And you can absolutely see the difference it had
0:31:03 on what photography looks like. Now, you know, you go from Ansel Adams, hyper-composed, or these
0:31:09 beautiful New York-style Vogue shoots, right? Where lighting was king. And we end up in more
0:31:13 of a street photography, much more naturalistic style. I don’t even know if one’s worse than the
0:31:14 other, right? I actually like both.
0:31:14 Yeah.
0:31:17 But it absolutely will have an effect on the world.
0:31:17 Yeah.
0:31:23 And actually, what it did do, though, is make it much harder to make a living as a
0:31:23 photographer.
0:31:24 Right.
0:31:29 You know, and I don’t know what the answer is here, you know, Andreessen, or what was it? Andreessen
0:31:32 or Horowitz, I can’t remember, one of the two of them used to give a talk where they were like,
0:31:38 innovation is often inherently destructive, right? Records got rid of in-home musicians.
0:31:38 Right.
0:31:45 And symphonies. Washing machines displaced house workers, stuff like that. But in the creative
0:31:49 side of the space, I’ve never seen humans be less creative just because they’re not paid for it.
0:31:52 We’re talking about how people are getting paid less to be photographers and everything. We’re
0:31:53 not seeing fewer photographers.
0:31:53 Right.
0:31:56 This is what’s hard about subjects like this. We’re talking about something that’s both
0:31:58 business and art.
0:31:59 Exactly. Yeah.
0:32:04 But let’s kind of bring it back to like, kind of a couple things. Just Alembic and marketing and stuff.
0:32:05 Yeah. Yeah. Okay. Cool.
0:32:10 So say you have all the spiking neural networks, right? Say you have the causal math, you can build
0:32:15 the chain reactions. The key with what you have to do then, which I think kind of brings
0:32:18 it into this, is you have to make it understandable to a human.
0:32:18 Of course.
0:32:18 Right.
0:32:23 And you have to be able to take action on it. And we talk about this in my company a lot. Every
0:32:28 human has their own superpowers and specialties. I don’t expect everybody to be great at data and
0:32:32 math. I expect some people to be incredible at EQ and be able to keep the office together.
0:32:32 Right.
0:32:33 Yeah.
0:32:33 Stuff like that.
0:32:34 Shout out to those people.
0:32:39 Seriously. When we’re doing this, you have to have the system meet people where they’re at.
0:32:41 And that’s where that last mile comes in.
0:32:41 Yeah.
0:32:45 We take the data, and instead of being like, I want to find a needle in a haystack, like
0:32:49 a security company being like, I need to find the one bad actor, we have to surface all
0:32:55 the data, have it make sense, and let it help them accomplish their goal. And what that means
0:33:00 we have to do is monstrous. We have to refine, like your deep learning corpus, monstrous
0:33:04 amounts of disparate data that’s pseudo-structured at best.
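To make “pseudo-structured at best” concrete, here is a toy sketch (mine, not Alembic’s actual pipeline; all field names and numbers are invented) of coercing disparate marketing events into one minimal schema before any analysis:

```python
# Toy normalizer: three differently-shaped marketing events
# coerced into one (when, channel, value) schema.
# Every field name here is invented for illustration.
raw_events = [
    {"ts": "2024-07-28", "channel": "tv", "nielsen_rating": 4.2},
    {"date": "2024-07-28", "store_id": 17, "visits": 350},
    {"time": "2024-07-28", "ad": "google", "clicks": 1200},
]

def normalize(event):
    """Map a heterogeneous event dict onto a common tuple."""
    # Different sources name the timestamp differently.
    when = event.get("ts") or event.get("date") or event.get("time")
    if "nielsen_rating" in event:
        return (when, "tv", event["nielsen_rating"])
    if "visits" in event:
        return (when, f"store_{event['store_id']}", float(event["visits"]))
    return (when, event["ad"], float(event["clicks"]))

rows = [normalize(e) for e in raw_events]
```

Once everything lives in one schema, the downstream comparison and outlier work becomes possible at all.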
0:33:11 And be able to pull insights out of it. And with those insights, then be able to let people
0:33:16 meet the goals of their group. And I think that that in general with marketing is when you think
0:33:20 about it, you’re being like, if I can see everything, if I can see all the creative stuff I’m doing,
0:33:25 even the cool stuff, like, I did a cool activation pop-up in New York, and I can treat that the
0:33:26 same as I did a Google ad.
0:33:26 Right.
0:33:27 Right.
0:33:31 Right now, one’s ignored. Those cool pop-ups that work will get credit,
0:33:35 and we will get more cool things. Right. Whereas otherwise we’re just going to get
0:33:40 more of a whatever, right? Just some random, simple thing: you, like, buy this thing.
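One minimal way to stop ignoring the small channels, illustrated with a toy z-score (my own sketch with invented numbers, not Alembic’s signal processing): standardize each channel’s latest value against its own history, so a pop-up’s spike and a paid ad’s drift land on one comparable scale.

```python
from statistics import mean, stdev

def zscore_latest(series):
    """Standardize the newest value against the channel's own
    history, so channels with wildly different scales compare."""
    history, latest = series[:-1], series[-1]
    mu, sigma = mean(history), stdev(history)
    return (latest - mu) / sigma if sigma else 0.0

# Invented numbers: a big, steady paid channel versus a small
# pop-up channel that just spiked.
channels = {
    "google_ads_clicks": [120, 130, 125, 128, 131],
    "nyc_popup_visits":  [40, 38, 42, 39, 95],
}
lift = {name: zscore_latest(s) for name, s in channels.items()}
# On this common scale, the pop-up's spike dominates even though
# its raw counts are far smaller than the ad channel's.
```

That is the sense in which the cool pop-up can be treated “the same as a Google ad”: both become standardized signals rather than incomparable raw counts.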
0:33:46 When we talk about like creatives and business and everything, great information is always good
0:33:51 for buyer and seller. And so quality information is a plus. Because, and I’ve talked
0:33:55 to every CMO, chief marketing officer, on earth, right? If companies see that somebody
0:34:00 who’s creative is making me $2 for every dollar I spend, I’m going to spend as many dollars
0:34:05 with that human as possible. Whereas if you go, well, yeah, we did that cool pop-up, but
0:34:09 I can’t see what that creator did. Right. Yeah. Why’d you pay the premium? Right. We’re going
0:34:14 to spend on vibes. Yeah. I think that like AI and everything, as we talk about nowadays,
0:34:20 can be wonderful and actionable and push ahead and provide real value for people where you could
0:34:25 be like, I know for every $12 I spend here, I should sponsor the NBA. I should sponsor the
0:34:29 WNBA because it makes me money. Right. Actually, I should be paying the WNBA more money because
0:34:33 then they can do more for me. Right. It creates equality across the data sets,
0:34:37 because the things that get ignored are the outliers or the smaller things, since they
0:34:42 can’t pick up enough signal. Right. We want to get all the signal. Yeah. Do it. And then
0:34:47 you can distribute correctly. Makes a lot of sense. So we’ve been ending the shows recently asking
0:34:52 the guests a question, but I kind of want to ask you a variant of it. The question is: what
0:34:57 AI tools are you using lately that, you know, you really like or might recommend? But I think
0:35:03 in this case, I mean, feel free, but also for somebody who’s out there who’s whatever, whatever
0:35:09 age of their life, but they’re newish to marketing or maybe they’ve been in marketing for a while,
0:35:16 but they’re new to getting a handle on AI and Gen AI and like how to actually start using it in a way
0:35:20 that, you know, can be constructive to the work they’re doing, their career development, that kind
0:35:25 of thing. Tools that you’d suggest, a book that you’d suggest, some resource you might suggest
0:35:32 to somebody out there who wants to, you know, get hands and brain on with AI and marketing right now.
0:35:37 All right. I’ll give two versions. I’ll give the, um, “I’m just getting into it,” and I’ll give the
0:35:39 “I’m the giant advanced nerd.” Fantastic.
0:35:46 So for beginners, what I always recommend for any of this stuff is, you literally can just
0:35:52 start with ChatGPT. Sure. What I will say about the best of this is: plan before you create. And
0:35:58 the number one thing that people forget to do is tell the system, you may ask clarifying
0:36:04 questions. Say, I want to plan, say anything you like, and be like, please ask clarifying questions.
0:36:07 Yeah. And then it’ll just start talking back to you, and you two will make the best plan on earth
0:36:12 and act on it. You can do this on how to learn. You could do this on how to do stuff, anything.
0:36:18 And I think people forget that, um, they want a declarative statement, but it can be a conversation.
0:36:20 Right. That’s good. That’s good advice. Yeah.
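That planning tip can be sketched as a plain prompt template. This is a toy illustration of the advice, not any particular vendor’s API; the function name and wording are invented. It also repeats the core instruction at the top and the bottom, which helps with long prompts:

```python
def planning_prompt(instruction: str, context: str) -> str:
    """Toy template: invite clarifying questions before the answer,
    and repeat the core instruction at both the top and the bottom,
    since long prompts tend to be followed best at the ends."""
    return "\n\n".join([
        instruction,  # instruction up top
        "Before answering, please ask me any clarifying questions you need.",
        context,      # the long middle: background, notes, data
        instruction,  # instruction repeated at the bottom
    ])

prompt = planning_prompt(
    "Help me plan a six-week path into AI-assisted marketing.",
    "Background: ten years in brand marketing, no coding experience.",
)
```

Paste the result into whichever chat model you use; the point is the conversational plan-first structure, not the tool.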
0:36:26 And, uh, it’s the number one thing I teach, myself. The second thing is, and this sounds really
0:36:30 silly for beginners: if you’re doing a long conversation, like a big, long prompt, put the
0:36:36 instructions at the very top and at the very bottom. Yeah. Yeah. Absolutely. When things are long
0:36:40 enough, you need to put them in both places. There are the lost-in-the-middle studies on this. Those two things
0:36:44 are probably the top two things that can get you 80% of the way there for improvement, and just
0:36:50 explore. The second thing is find voices you like, right? Whether that’s Scott Bollinger, whether that’s
0:36:56 Gary V, whether that’s anything, and you can just use those voices to offset, right? You do want to talk
0:37:02 to actual humans. For the super advanced folks, one thing I’ll recommend is: choose your top 10 data
0:37:08 scientists, whether it’s Yann LeCun, whether it’s whoever, and to tweak your algorithm, go to LinkedIn
0:37:13 and like the last 10 of their posts that are only technical. Go feed and train the thing to give you
0:37:18 everything you want. I have mine trained to only give me, you know, papers from arXiv and, like,
0:37:23 cool algorithms like that. And I actually find it very convenient. Yeah, I bet. And so I know people get
0:37:26 really frustrated with that type of stuff, but it’s the absolute best way to get the largest footprint
0:37:34 quickly. The second thing I will recommend is: please go look at other disciplines. People get
0:37:38 very myopic. We’re starting to see people do diffusion models right now for LLMs, when we were
0:37:43 doing all transformers. I am never precious about where I get methodology from. I’m only precious about
0:37:49 what the result is. Yeah. And so I think that the breadth of learning, the curiosity, is the number one
0:37:55 thing I feel like people aren’t doing very much. It’s: this is my team lately, my LLM that I like to use, this
0:38:01 is my algorithm, this thing. And I’m like, it’s all hammers and nails, guys. Right. Go build something.
0:38:07 Yep. And so, you know, Ben Franklin, when he discovered the nature of electricity, wasn’t trying
0:38:11 to discover electricity. He was trying to invent the lightning rod so that all the houses wouldn’t burn
0:38:18 down. So, like, go explore. Those are the two things I would say. Excellent. Tomás Puig, Alembic. For
0:38:23 listeners who would like to learn more, website, where can they go online to learn more about Alembic,
0:38:28 the work you’re doing? Getalembic.com, right? Yeah. Nice little corporate-y site there. We are very
0:38:33 friendly. And then also, we are often at several of the industry conferences you’ll be at: Gartner,
0:38:39 NVIDIA, Forrester, that type of stuff. Always feel free to say hi to us. Fantastic. Tomás,
0:38:43 thank you so much for your time. It’s been a, really, it’s been a pleasure talking to you. It’s
0:38:47 fun to kind of get into the theoretical. Yeah, yeah. If you’re ever up in the city, drop by the office or
0:38:51 wherever and say hello. I’d love to chat with you more. It was just an interesting conversation.
0:39:34 Thank you.
0:01:54 Right.
0:01:59 The other founder was a guy named John Adams. John Adams, very storied infrastructure engineer in the
0:02:04 valley. He was the 13th employee at Twitter and took the company from the time it was a Mac mini
0:02:08 with a bad Ethernet cable under his desk all the way through the IPO.
0:02:11 I think for many years, he was the longest serving person not on the board of directors.
0:02:12 Wow. Okay.
0:02:19 And then Seth Little, who is a world-renowned creative director and designer who has rebuilt
0:02:24 brands for Lego and even done work for Apple, stuff like that over the period of time.
0:02:28 The three of us got together, and there were a number of reasons why we really chose this field.
0:02:34 But the most important is that we have felt that marketing and anything around the creative
0:02:40 and even the arts and trying to promote it had been, and no one had been able to be a storyteller
0:02:40 for years.
0:02:45 Everything had become so obsessed with trying to get that last little click, that last little
0:02:48 engagement, that last little performance, that we just about lost the plot.
0:02:56 And at the same time, I had a few deep beliefs about where things were going with technology that
0:03:01 married into that. And one of the things that, and this kind of brings it to there, is the
0:03:05 company really came out of a lot of the mathematics that were born during the pandemic.
0:03:06 Okay. How so?
0:03:10 A lot of people think MRNA is the only big tech to come out of the pandemic. The other thing
0:03:14 to come out was a lot of incredible math. It was really one of the first times that we
0:03:20 used huge-scale compute and modeling to actually be able to analyze something causally in real
0:03:25 time as an emergency was happening. We’d done weather and stuff before, right? But nothing
0:03:30 like that. And so we’re like, well, nuclear, weather, disease, drug discovery, everything
0:03:35 else is using these type of super compute modeling and deep learning. Why is everybody who does
0:03:38 creative, the arts, marketing, and everything else stuck with math in the 1970s?
0:03:44 So we’re a bit more unique in a company also as well, that we actually run our own private
0:03:47 cluster in our own cage. Like we physically own our hardware.
0:03:51 Oh, wow. Okay. You guys, I should have said at the beginning, you’re San Francisco-based?
0:03:53 San Francisco-based. Hardware’s North Virginia.
0:03:53 Okay.
0:03:57 And I’m assuming this is a very nerdy audience. And so I’m going to be like, we L2 patched straight
0:03:59 into AWS East 1 out of Equinix.
0:04:00 Yeah, go, go.
0:04:04 And so when we kind of found that the company, we sat there and we’re like, well, what would
0:04:08 it take? Well, the problem is, is that what it would take to actually do this was an entire
0:04:14 deep offshoot of a branch of mathematics. So one of the things we deeply believe, and we
0:04:19 talk about a lot, is that the profit at companies follows the flow of information.
0:04:22 Okay. So if you want to follow the money, everybody says follow the money, but really follow the
0:04:23 information.
0:04:25 Follow the information, you will find the money.
0:04:26 You’ll find the money.
0:04:26 Right.
0:04:32 Right. And what’s so important about that is that I believe all the alpha, all the profit
0:04:38 that will exist in corporations in the next while will all come from private data sets.
0:04:44 We are seeing major models and LL models. In fact, there was a paper just released where
0:04:48 they showed that models, when training on the same similar public data sets, end up more than
0:04:51 90% the same all the time, by the way, in nettings.
0:04:52 Okay.
0:04:57 We are seeing a convergence there. And so, and Jensen spoke to this actually more recently
0:04:59 too, where we will buy it like electricity.
0:05:00 Right.
0:05:00 Right.
0:05:05 But that means that it will also be converged and commoditized. The way I put it though, is
0:05:10 that these models are converging, right? And so as they converge, they will be the difference
0:05:12 between buying, say, BP and Shell gasoline.
0:05:13 Sure.
0:05:17 Right. No one has the same private data. Now a private data could be a songwriter writes his
0:05:22 own song. That is a private piece of thing that he wrote that is his, you know, what it
0:05:26 is. It could also be a giant corporation that has a huge corpus of data that nothing else
0:05:31 can see. And so I believe that within the next period of time, the thing that will generate
0:05:37 the most data will actually be lived human experiences. We’ll generate the most brand net new data.
0:05:38 Right.
0:05:38 Right.
0:05:41 I’m not saying that there’s not a ton of data that’s like derivative or whatever.
0:05:41 Sure.
0:05:47 Net new data. And so this could be, let’s take Disney as an example, fake example. Yeah. ESPN,
0:05:52 Disney plus, Hulu, the parks, magic bands, all the toys, everything else.
0:05:52 Sure.
0:05:53 Marvel.
0:05:53 A lot.
0:05:54 Star Wars.
0:05:55 A lot, a lot.
0:06:01 Monstrous, right? Like you’ve got a century of scripted IP. So all of this means that that
0:06:06 being able to take that and feed that and learn from it, you know, understand how it’s
0:06:09 structurally connected and then act upon it with agents and models of what the world’s
0:06:10 going to be like.
0:06:10 Right.
0:06:11 And that’s where the profits can come from.
0:06:16 Right. And so everybody, the more you can act on that with your data, the more it’s going
0:06:21 to inform what you can and perhaps should do and how you do it and everything.
0:06:24 Yeah. And so one of, you know, uh, the first company to kind of really do this in the
0:06:28 other days was I’m a huge fan of Renaissance technologies. They were the first high frequency
0:06:33 trading firm kind of out of New York for that mathematician. Right. And they really, you
0:06:38 saw this in the quant space first, then you saw Palantir try to do it in, they were really
0:06:42 trying to find a needle in a haystack, right? Like one actor out of a group. Right. Like
0:06:46 what really happens is you have this massive data. And so, you know, there’s a few reasons
0:06:51 I kind of see this already, you know, first look at open source models, deep seek, et cetera.
0:06:56 They’re getting close to matching the premium APIs in terms of performance and everything
0:07:02 else. Right. Second companies, these larger firms, you are seeing larger firms be a much
0:07:05 bigger part of this than the mobile revolution in the early days. Well, you have a couple
0:07:08 small companies like Kerser and everything that have a small number of employees that are pushing
0:07:12 it through or just being the large LLM stuff. There’s serious amounts of capital that are backed
0:07:17 by the large backers. Yeah. The private data has much higher prominence in the sub and they
0:07:22 have huge corpuses to apply this to. And the third is, is that anything that generates
0:07:28 derivative data, say you do something and there’s an action taken. It’s like compounding interest.
0:07:29 Right. It’s a flywheel.
0:07:34 Which is a cycle flywheel, just like you’re saying. And so it feeds upon itself. And so if we think
0:07:40 about like the 2010s, right? 2010s, the advantage went to the team to capture and operationalized
0:07:45 new data or audience fire hoses. You had the rise of Facebook, the rise of like these things.
0:07:52 2020s, where we’re seeing right now is who could stack the most GPUs. Hardware mattered so much,
0:07:56 right? And like you literally see teams being separated, the haps and have nots by the amount
0:08:01 of horsepower they have to apply to these things. Some of that’s being offset by innovation, right?
0:08:06 Deep seek, et cetera. But still to a large extent to serve that, you got to have the power and chips.
0:08:14 So I think 2030s, it’s going to be who turns their private data and its exhaust into a deep
0:08:20 learning asset is then applied against. How do you work with your customers? How do you help brands,
0:08:25 you know, start to tap into this data wherever they’re at in that process and do things with it?
0:08:29 Gain insights is what we always talk about, but you know, how do you help them?
0:08:33 So think about it like this. I’d say that two things to know about Alembic. One,
0:08:38 every single piece of data that ever comes into our system is anonymous aggregate data. We allow
0:08:44 no PII ever. There’s literally never been an incident of it in our system. Just like when you
0:08:48 think about where we really based off the biomedical math, right? You’re not, you don’t have PII there
0:08:52 either for like a third phase trial, right? You’re not tracking. Great. The second thing to know
0:08:58 is that we ingest enormous amounts of it. Yeah. We’ve brought in a hundred billion rows of data in
0:09:02 three days for clients. And this can be like every transaction from 17,000 stores,
0:09:07 stuff like that. The problem our clients have and anybody has is, and this is one thing that I find
0:09:14 very interesting about why you haven’t seen these whole like query, the database tools take off as
0:09:18 much as you’ve seen the generative tools take off. Because the problem actually is, you ever read
0:09:22 Hitchhiker’s Guide to the Galaxy? Long time ago. Yeah. So you get the very end of the book, right?
0:09:25 They’re like, what’s the mean of life, the interest and everything? Yeah. And they go, well,
0:09:30 the problem is we don’t know the question. Right. When you have that much data, you don’t even know
0:09:35 the question to ask. Yeah. To be able to analyze it. Or worse yet, you think you know the question
0:09:39 and you’re so far off. Then I’m down a rabbit hole. Oh, and then, and then you make a wrong
0:09:44 assumption, right? And it becomes a huge pain. And so what we have to do first is we have to ingest an
0:09:47 enormous amount of data and we have to signal process it. We actually use, we announced this
0:09:52 at GTC actually. The way we do signal processing and probably one of the coolest pieces of tech we have
0:09:57 is we actually use spiking neural networks to do it. That gives us a lot of superpowers. And the two
0:10:02 problems we had to solve for it were this. That’s why, why go build? You know, spiking neural networks
0:10:06 typically have been done neuromorphic hardware, which usually people think of wetware, like half
0:10:14 biological computers. Right. We wrote a simulator for the wetware as a kernel in NVIDIA, just like you
0:10:19 wrote a simulator for quantum. Yep. We wrote the simulator for neuromorphic on the hardware. And
0:10:24 that’s how we run an SNN. And so it’s actually fully custom to the NVIDIA chip. Obviously the answer
0:10:28 must be good, but how, how well does it perform? Really well, because the two problems that had to
0:10:34 solve were this. One, how do you compare apples and oranges? Right. Yeah. How is a Nielsen rating or
0:10:42 TV or a Spotify view the same as a phone visit or walking into a store, right? Different modalities,
0:10:47 different mediums, right? They all count as marketing, but exercises, touch points, something,
0:10:52 but yeah. Marketing has the worst job because their little job is everything. Right. Right. Second
0:10:56 reason it’s the worst job is everybody on earth thinks they’re a marketer. The second thing we had to
0:11:00 solve was marketing. You always hear about this. Oh yeah, we’ll do a campaign for a couple of weeks.
0:11:07 So you have no time history in our P ability, but how do you look for outliers with no time history?
0:11:12 So those two problems had to be solved. And SNNs were the solution for that. We turn everything into
0:11:17 spike signals, spike encodings. That’s all is the Apple’s. For people who don’t know one sentence
0:11:22 definition of what an SNN is. A spiking neural network is meant to be a digital twin of the human brain
0:11:27 to where it can actually process signals like neuron spikes, just like your brain would.
0:11:32 And what it’s doing is, is it seeing signal and then you’re firing neurons off a propensity.
0:11:36 There’s a lot more explanation to that. The reason that they’re really well-loved
0:11:41 is they’re incredibly fast and they are online as we call them. So that means that they are always that
0:11:46 and they’re evolving. So you don’t actually train them in advance. So they’re an evolving network.
0:11:52 So they really originally were really well-loved in people thought about that for IoT and they’re
0:12:01 fast. We use them to be able to find a way to detect outliers. So a human, a baby can pattern match from
0:12:07 the moment it’s born almost and teach it. So when you’re looking for outliers in data, normally what
0:12:11 we would do is we use prediction, grab all the previous data, see where the highest highs would be
0:12:16 predicted to be lowest lows and anything outside of that outlier. There’s other ways to do it too,
0:12:23 but for this audience, we’ll be overly redacted. With what we do is we’re actually looking at patterns
0:12:30 and actually seeing those changes from that neural morph, like the neural perspective in SNNs. And so
0:12:33 we can do outlier detections with no time history.
0:12:40 The reason this is important is say we actually, we presented at GTC, a case study with the CMO of Delta
0:12:44 and they sponsored the Olympics. And there’s a great recording of this. Actually, we did a session
0:12:49 and the problem is the Olympics happens for two weeks, three, two to four years.
0:12:57 If you do daily data, that’s 14 time steps. What do you do with that? So doing that signal processing,
0:13:02 first, we had to do that. The second thing is we actually had to figure out how to do the
0:13:07 connections, connect the chain of events. So say you watch the Olympics, then you Google for a Delta
0:13:11 flight, you click on a Google flights ad, and then you buy it. How do you connect those things
0:13:17 in time? So that’s where we ended up building all of our causal mathematics. So we use causal inference
0:13:21 and transfer information mathematics to be able to do that. So the first thing is we have to see the
0:13:27 signal at all. We build each time series with its own mini neural network. Those neural networks then
0:13:30 get chained together with causal links, and then we can build chain reactions.
0:13:33 Then you can mine the chain reactions for intelligence.
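One way to picture "chaining" per-series networks with causal links and then mining the chain reactions: treat each series as a node in a directed graph and enumerate the causal paths. The edge names and strengths below are invented for illustration, not real Alembic output.

```python
# Illustrative sketch: causal links between time series as a directed graph;
# a "chain reaction" is a path through it. Data is made up for the example.

causal_links = {
    # source series -> list of (target series, causal strength)
    "tv_airing":       [("brand_search", 0.8)],
    "brand_search":    [("flight_ad_click", 0.6)],
    "flight_ad_click": [("purchase", 0.5)],
    "purchase":        [],
}

def chain_reactions(graph, start, path=None):
    """Enumerate every causal chain starting at `start`, depth-first."""
    path = (path or []) + [start]
    chains = [path]
    for target, _strength in graph[start]:
        chains.extend(chain_reactions(graph, target, path))
    return chains

chains = chain_reactions(causal_links, "tv_airing")
```

Once the chains are enumerated, "mining them for intelligence" reduces to scoring and filtering paths, which is what the dynamic search described later in the conversation does.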
0:13:39 Okay. So let’s stay with that example then of Delta and the Olympics. A minute ago, you were talking
0:13:44 about, you know, with a client being able to ingest their data. And so I’m wondering, can you map like
0:13:48 these technologies you’re talking about in the spike neural networks? And you started talking about
0:13:53 causal AI, which is the other thing I wanted to ask you about, I need to hit it. What can you do
0:13:59 with the data? How do these chain reactions work? And, you know, how does it apply? What does the
0:14:02 client see? I don't want to say "at the end," but, you know.
0:14:06 You know, it’s funny. At the end, I actually believe that all dashboards should never exist.
0:14:12 Dashboards only exist to derive intelligence from them, right? Nobody wakes up in the morning and goes,
0:14:14 you know what I want in life? Another dashboard.
0:14:14 Yeah.
0:14:15 Yeah.
0:14:18 So what you do is you look at the dashboard and then you write an email. So the way our
0:14:23 intelligence comes out is, uh, actually the model for it was
0:14:25 like how the president of the U.S. gets an intelligence briefing every day.
0:14:30 We provide them as literal intelligence. Great. Yep. Oddly enough, we use LLMs at the company,
0:14:35 but not in the same way other people do. So there’s a game called Mad Libs. I’ll play with my kids.
0:14:40 It’s like you have a paragraph and then you have a blank and you fill in the blank and make a funny
0:14:45 sentence. That’s how we use LLMs. They never make a decision about data ever, but we do love them for
0:14:48 user experience. So they'll write kind of the body of the report, but then the causal chains will
0:14:54 actually just put in the actual data sets. And that means that, um, it kind of inoculates them
0:14:58 against the hallucination issue, because everything that's an actual important data point, that's
0:15:04 not like an "is" or an "and" or a connector, is handled by a deeper method, right? A deep learning method.
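The "Mad Libs" pattern described here can be sketched as fill-in-the-blank templating: the language model only supplies connective prose around placeholders, and every real number is injected from the computed causal chains. All names and values below are invented for the example.

```python
# Illustrative sketch: the LLM never produces data values; it only writes
# the template around them. Chain results here are hypothetical.

chain_result = {
    "event": "Olympics medal ceremony sponsorship",
    "lift_pct": 23.0,   # computed upstream by the causal pipeline
    "lag_days": 3,      # likewise; never generated by the LLM
}

# In production this template text would come from an LLM; it is hard-coded
# here to show that the model never touches the numbers themselves.
template = (
    "Following the {event}, bookings rose {lift_pct:.1f}% "
    "with an optimal lag of {lag_days} days."
)

briefing = template.format(**chain_result)
```

Because the data slots are filled programmatically, a hallucinated value would have to come from the pipeline, not the language model, which is the inoculation being described.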
0:15:08 And then we use the rest for user experience and communication. Oh, interesting. Those are derived
0:15:12 from the chains, but let’s take Delta as an example and talk about the Olympics. Sponsoring the Olympics
0:15:18 on national level is expensive. We’re talking eight figures at least. I think, uh, NBC reported
0:15:23 hundreds of millions, like maybe even a billion dollars in revenue off of the Olympics alone. I forget
0:15:29 what it was. I’d have to go look. It was massive. And when you’re doing that, there’s two types of
0:15:33 things that occur. One is you buy a whole bunch of 30 and 60 second ad spots, right? That you do.
0:15:38 The other is that you see this in sports all the time. You have things that are named after
0:15:43 companies. So if you watch the Olympics, you saw that the medal presentation ceremonies were sponsored by Delta.
0:15:49 You know what sells a whole lot of tickets to Paris? Watching the Eiffel Tower in the background in a
0:15:54 really emotional moment as you put a gold medal on an athlete. Yeah. That was actually more
0:15:59 effective than the ads themselves in some circumstances. Yeah, no, I, I can see it.
0:16:04 How do you connect that? Right. Somebody’s booking a flight. They’re doing some planning. There’s a delay
0:16:09 between those things. It's not like you do it while you're watching the TV. So you have to calculate
0:16:14 what we call the optimal lag, right, between the time series. How do you calculate those
0:16:19 things? And so we think about this like common sense: of course I
0:16:23 saw that, and then someone's going to buy something. It's much harder to mathematically prove it.
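A minimal sketch of the optimal-lag idea: scan lagged correlations between a "cause" series and an "effect" series and keep the lag that correlates best. The series here are synthetic, and real systems use richer measures (the conversation mentions transfer information), so treat this purely as an illustration.

```python
# Illustrative sketch: estimating the optimal lag between two daily series
# with a simple lagged Pearson correlation scan. Data is synthetic.

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def optimal_lag(cause, effect, max_lag):
    """Return the lag (in steps) at which `cause` best correlates with `effect`."""
    best_lag, best_r = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Compare cause[t] with effect[t + lag].
        r = pearson(cause[: len(cause) - lag], effect[lag:])
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

# TV exposure spikes on day 2; bookings respond three days later.
exposure = [0, 0, 10, 0, 0, 0, 0, 0, 0, 0]
bookings = [1, 1, 1, 1, 1, 9, 1, 1, 1, 1]
lag = optimal_lag(exposure, bookings, 5)
```

With only 14 daily time steps, as in the Olympics example, a scan like this is statistically fragile, which is exactly why the heavier signal-processing machinery discussed above is needed.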
0:16:28 You can’t even imagine. You know, we work with a finance organization where we actually look at
0:16:34 what causes ETF, exchange-traded fund, flows. Conversions can be anything. You know what I mean?
0:16:38 You could be selling something or you could be trying to change the volume of something,
0:16:42 or you could actually care about more, the most open source contributors you get.
0:16:46 What we do is when we look at these chain reactions, we have nodes and edges as we call
0:16:49 them, right? Nodes are kind of the points like connect the dots. The edge is the connection
0:16:55 between them. We calculate every connection that could possibly exist as we build stuff.
0:16:55 Right.
0:17:01 So then we can dynamically search the intelligence afterwards. And so we can literally morph our
0:17:04 reports based off what the user wants to see. So if you’re like, I want to know all the
0:17:09 things that are about selling the ticket. And the next day you’re going to know, I want to know all
0:17:13 the things that are about gaining frequent flyers. All you do is change the focus. It redoes the search
0:17:18 in real time, instantaneously. And so that type of stuff, what I find really interesting, is a lot of
0:17:23 what people are working on in deep learning right now. The reason why we're so sticky with these huge
0:17:29 enterprise customers, and we're not a cheap system, is that they derive real value out of it.
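The "change the focus" search described a moment ago can be pictured very simply: with every possible edge precomputed, a report is just a filter over the stored chains, so switching focus is instantaneous. The chain data below is invented for illustration.

```python
# Illustrative sketch: refocusing a report is a filter over precomputed
# causal chains. Chains and node names here are hypothetical.

chains = [
    ["tv_airing", "brand_search", "ticket_purchase"],
    ["tv_airing", "app_open", "frequent_flyer_signup"],
    ["email_blast", "ticket_purchase"],
]

def refocus(chains, focus):
    """Return only the chains that touch the node the user cares about."""
    return [c for c in chains if focus in c]

tickets = refocus(chains, "ticket_purchase")
loyalty = refocus(chains, "frequent_flyer_signup")
```

No recomputation happens at query time; the expensive work of calculating every connection was done up front, which is what makes the report "morph" in real time.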
0:17:32 And also we understand that we have to meet them where they are.
0:17:34 How do you mean meet them where they are?
0:17:38 Well, I find that a lot of people are talking about AI and form factor. Should it be hardware?
0:17:43 Should it be software? How should it speak? Everything else. But I never hear anybody ask:
0:17:49 is the thing it outputs actually useful? You know, I believe that actually we're having a paradigm
0:17:54 shift in the entire thing. For the longest time, there were a ton of studies that said 90% of the world
0:18:01 was consumers. 9% was curators and 1% was creators. And this actually held true for pretty much like as
0:18:06 long as people could remember. But now suddenly with AI, the curators are actually creators.
0:18:07 Right.
0:18:14 So the 1% is now 10%. So you've 10x'd the footprint of people who are wanting to build things.
0:18:18 And the monstrous strategic shift that occurs with that is large.
0:18:23 The way you put that, it reminds me of, I mean, it reminds me of the 2000s and the web first,
0:18:27 you know, coming out and anybody could publish if you wanted to.
0:18:32 You know what reminds me of the most is actually when you could start recording your own record in
0:18:33 your living room.
0:18:34 Right.
0:18:38 It used to be, you had to have these ginormous studios. We see them in films, right? And everything
0:18:42 else. And then you have people like Beck and everything being like, I recorded a hit record on my
0:18:48 four-track. Yeah. And the democratization of that, I don't think anyone would say music is worse off
0:18:52 because of it. The music industry's monetization may have suffered, but the actual quality of the
0:18:57 music and the stuff coming out, I'd say we have access to more
0:18:59 independent music than we have ever had.
0:19:03 Oh, for sure. No, the difference between how my kids get music and how I did, you know,
0:19:05 couldn’t be, couldn’t be more different.
0:19:10 So, kind of like what we talk about, let's take code in this example. We use a
0:19:16 lot of AI to write code now, and so does everybody else. But code used to be, or is in a lot of
0:19:21 places, a bespoke thing, right? Like the most beautiful furniture ever, and somebody's
0:19:28 hand-crafting this beautiful object. Or before the assembly line existed at Ford, right? Any motor
0:19:33 vehicle would be handcrafted, every single little individual thing. But now it's no longer
0:19:39 bespoke. Now you actually have mass-crafted, mass-generated objects.
0:19:44 In the beginning, I'm sure mass-generated cars did not equal the bespoke cars. Of course.
0:19:48 Nowadays, I would say most of the mass-generated cars are probably better than those
0:19:52 bespoke cars from a safety perspective. Code is doing the same thing. And so now you have two
0:19:58 fronts. You have everything able to be mass-generated. And the second thing is
0:20:04 the curatorial class, right? Those with taste are now creators by default, right?
0:20:11 The playlist is a Veblen good, right? Like, um, now I love tools that give people agency to
0:20:19 control themselves. I am deeply worried long-term about access because that can be hard, but I think
0:20:23 about the number of musicians who never would have had anyone to hear them at all. If a four track
0:20:28 didn’t exist. Yeah. I think about the number of writers or filmmakers or brilliant people who are never
0:20:33 going to be heard at all, if they did not have the democratization of tools. When we bring this back
0:20:38 around to marketing, attention is really what there is. It's
0:20:43 supply-side limited. There's only so much attention in the world, and every
0:20:47 company in the entire world is competing for the same slice of attention at the end of the day.
0:20:52 The only commodity that can never be increased is a person's heartbeats. You only have so many until the day you die,
0:20:56 and you can't get them back. So when you're doing that marketing and everything else, and we're trying to do
0:21:00 that, we’re trying to understand the universe and get down to it. Now we’ve got a little like
0:21:04 theoretical here, but I think it’s an important paradigm shift that we’re discussing because
0:21:10 you can’t separate at the end of the day marketing from the rest of the businesses it governs.
0:21:17 So, so bring it back down sort of to, you know, business nuts and bolts level. How is AI, I mean,
0:21:23 as much as you can, and you know, be specific because there's so many ways, but what are some of the biggest
0:21:29 ways you see, maybe that's a better way to ask it, that AI is already changing how brands relate to their audiences
0:21:36 and, you know, what's happening kind of just down the road? I think this idea of, and it's funny, as I was
0:21:42 prepping to talk to you, I was thinking about, we’ve done a lot of episodes on health and medicine lately, and this
0:21:50 idea of precision medicine, right? And being able to offer, you know, each individual, the individualized care,
0:21:57 preventative care, you know, it’s technology enabling us to provide that level of personalization.
0:22:04 And so there’s a similar thing in some ways, as I understand it with marketing, and I mean, to really
0:22:08 probably not do it justice, but I break it down in my head to cookies following me around.
0:22:14 And that being kind of a, you know, to me seems like a brute force, maybe outdated at this point
0:22:19 way of doing it, but what’s AI doing for all this? And what’s it going to be doing for, for brands and
0:22:21 engagement and personalization?
0:22:28 I'll take a little bit of an anti stance here. Human beings love building tribes and community and the
0:22:34 most visceral experiences that they have. Let’s take sports. People will often say sports is the new
0:22:39 church. It’s not my phrase. It’s just what everybody, you know, you want that community, whether that is
0:22:45 in that format or it’s in the sports format or something else, that means that you need everybody
0:22:52 to experience the same thing and be able to respond to it. And if you think about the fondest experiences
0:22:58 you can remember, they almost always are with somebody else, right? You remember doing a thing with
0:23:03 somebody, or how it affected somebody. And so while I believe that personalization
0:23:11 is interesting, I actually believe that being able to really build these experiences that people have
0:23:16 and to be able to bring people closer together, those are the marketers that are going to win.
0:23:17 Okay.
0:23:22 Mathematically, I talk about this a lot where you’re like, what does an LLM mostly do? It predicts the
0:23:30 next best token. Inherently, that next best token is the one you would expect. That's what it's
0:23:35 supposed to do. That's right. In a weird way, it's
0:23:39 going to give you the median, right? It’s going to give you the answer you should have. But what
0:23:43 great artist, what great marketer wakes up in the morning and goes, you know what I want to be? The
0:23:49 median. And I don’t know, I can really feel, it’s gotten better, right? But I can still feel the
0:23:55 aesthetic of AI imagery and everything that comes out because it is still trying to, it is almost going
0:24:01 to have its own aesthetic. And so I think that what will happen is, is that the uniqueness of
0:24:07 experiences and everything else and the premium of that will rise. And that we will have this kind
0:24:11 of push against it, where it's not all about personalization forever. Everybody's going to
0:24:14 realize you still are going to want to experience things with other people.
0:24:15 Interesting. Yeah.
0:24:19 And so for maybe 10% of your life, you’re going to want to hyper-personalize,
0:24:22 like I want to talk one-on-one with it, right? Or maybe it’s an ephemeral thing. You’re
0:24:29 using, I heard a great founder talk recently about how he uses ChatGPT and he writes a story
0:24:34 with his kid at bedtime and he just doesn’t keep it. It’s audio. They’re just playing with it. That’s
0:24:39 ephemeral. But you don’t go to the water cooler the next day or get online and talk on Discord with
0:24:45 your buddy about that. And so I feel like often human nature is lost within this. So marketers,
0:24:50 our job is to create those experiences. And then when it’s a touch point for preference that you’ve
0:24:56 already created, then try to enhance it with some personalization. But I feel like people have lost
0:24:57 the plot a bit.
0:25:02 Yeah. Interesting. I don’t disagree. I just, it’s interesting to think about that in the context of,
0:25:08 as I said, all the personalization that jumps to mind when thinking about these types of things
0:25:10 or the potential personalization.
0:25:13 Well, I always think about personalization is once you’ve engaged with the brand,
0:25:16 how does the brand find the optimal path for you?
0:25:16 Yeah.
0:25:17 Deeply important.
0:25:18 Yeah.
0:25:24 If I’m already an Amazon customer, I need the happy path. If I am an Apple customer,
0:25:30 what iPhone do I buy? If I’m, I have some friends who love Alice and Olivia dresses. They’ve got a ton
0:25:33 of different dresses. Which one should they get? Right? They’re going to have preferences within that.
0:25:36 But I’m talking about like, how do you even get them to like the designer?
0:25:39 Right. You have that shared moment. Yeah.
0:25:46 Profit and the margin comes from brand, but what is the value of brand? How do you actually price that?
0:25:51 All right. So I’m going to flip this on you and it’s not even flipping. I was, I was thinking of
0:25:56 an excuse to, to play off of what you just said. How do we value the human in all this going forward
0:26:03 to go to the other side of the process and think about the creator, the, the ad man and woman,
0:26:08 if you will, whoever’s on, you know, the side of, is it curating? What’s, what’s the role for the
0:26:16 human for human creativity? How does that evolve as people in marketing, people in media, whatever part
0:26:20 of the process, you know, you’re working on, you’re, you’re overseeing, you’re collaborating on, like,
0:26:25 how does that role change? And, and years ago, and it’s crazy that I can say years ago, referring to
0:26:30 this podcast, but we can now, uh, we had somebody on, I don’t want to misrepresent his title, but a
0:26:35 creative director basically working in the game, video games industry and talking about how generative AI,
0:26:40 I think the title of the podcast was something about making zombie armies with Gen AI, something
0:26:45 to that effect and talking about how Gen AI was enabling this, you know, and, and it’s this metaphor
0:26:51 that we keep using. Think of your co-pilot as like an intern, right? A highly capable intern who needs
0:26:56 a lot of direction, instruction, learning the ropes, but they can generate good work if, you know,
0:27:01 correctly prompted, if you will. But he was talking about the ability for one person to kind of give
0:27:08 creative director level instructions and have the AI, you know, fill in the rest of the zombie army
0:27:14 from a couple of examples or generate landscape or, you know, that kind of a thing. And so when you
0:27:15 said, you know, curators.
0:27:16 No, there’s a guy named Andy Warhol.
0:27:18 Right? I’ve heard of him, yeah.
0:27:21 Right? Andy Warhol did not paint all his pieces. How’s it different?
0:27:25 Well, I don’t know how to draw the metaphor, but, because I’m sure Andy Warhol had lots of people
0:27:31 working for him as well. But what happens to the human when, if I can generate, you know,
0:27:37 the work of five people in a creative setting, does the role become for, instead of being a
0:27:46 writer who learned how to manage creative teams, I’m just more of a, you know, creative team manager
0:27:48 from the get-go and some of my team is AI or?
0:27:53 I think sometimes we get so theoretical about this stuff. You know, I had a terrible punk band
0:27:55 as a teenager. We were awful.
0:27:57 Nice. What are you called?
0:28:01 "And Then There Were Three." We had four guys, and then we ended up with three of them. And so
0:28:02 we just called the band "And Then There Were Three."
0:28:03 You’re a Genesis fan?
0:28:04 Oh, yeah, yeah.
0:28:05 Okay.
0:28:06 Yeah, it’s the whole joke.
0:28:09 That’s like, there’s no way, there’s no way you’re not.
0:28:14 I know. So the funny thing about all this is, I think back to then, right? If I try to put
0:28:19 myself in my young shoes, right? 15. All I want to do is make cool stuff.
0:28:20 Totally.
0:28:24 Ezra Klein gives a great talk about this, where he says, for those that become creatives,
0:28:29 like, all you have in the early days is taste. And the hardest part about taste is when you do
0:28:36 something, you suck. But you know you suck in the early days. And so you have to fight through all of
0:28:45 the hundreds and thousands and awful times to then generate a style and a body of work. And I think
0:28:53 about how incredibly demotivating awful that was at the time. And what it will do is for the people
0:28:59 starting out, it will create great agency. And they will be able to do incredibly cool things. And they
0:29:04 will get 80% of the way there. But I also think it’s going to create this weird, almost even harder
0:29:07 field to get through, where you’re going to be stuck there way longer.
0:29:13 At that sort of glass ceiling of understanding that there’s better work out there. I just don’t know
0:29:13 how to create it.
0:29:16 Yeah. Because I mean, you’re not failing as much.
0:29:17 Right.
0:29:23 Right. And so for me, the interesting conversation about this the most is what happens to us as humans
0:29:26 when we don’t fail enough to improve?
0:29:27 Right.
0:29:34 Because failure is what makes us improve. And man, I do things the hard way so many times.
0:29:34 Yeah.
0:29:39 Right. Even today, like, I’ll do things where I’m like, did I really do that? But I have a son who’s
0:29:44 seven. And I’ve watched him, like, struggle and get through and learn things and build them. We’re
0:29:46 building Gundam together now. They're Japanese robots.
0:29:46 Oh, nice. Yeah.
0:29:50 First one he built, he didn’t get it. He accidentally chopped off a part, right? Now he can build one by
0:29:51 himself. He’s seven.
0:29:52 Amazing.
0:29:55 And so what happens when failure ceases to exist?
0:30:02 Does the ability to so quickly generate so many more takes, do you think that desensitizes us to
0:30:02 failing?
0:30:05 I think it means that you’re not going to learn from the failures as much. You’ll learn different
0:30:08 things, right? You’ll learn how to manipulate the AI. We’ll become experts in AI manipulation.
0:30:13 It’s just an interesting thing. Do those small failures where they’re not really failures count?
0:30:17 Will it have the same effect? I actually don’t know the answer to this. I just think it’s
0:30:21 the most interesting component of what you mentioned from that basis.
0:30:26 Right. If I can take, I was, it’s showing my age, but my mind for some reason jumped to
0:30:34 digital photography. And, you know, that move from every photo I take literally costs me,
0:30:40 you know, a roll of film divided by X exposures to, and worse yet than the monetary cost is once
0:30:45 my rolls out. You know, I got the rest of my day in Paris and I can’t take pictures to, you know,
0:30:50 phone in my pocket. It’s all digital. Take a whole bunch, pick the best one, you know,
0:30:51 delete the others.
0:30:56 Yeah. And I mean, you can see the difference it had on, you can absolutely see the difference it had
0:31:03 on what photography looks like. Now, you know, you go from Ansel Adams, hyper-composed, or these
0:31:09 beautiful New York style Vogue shoots, right? Where lighting was king and we end up in more
0:31:13 of a street photography style, much more naturalistic style. I don’t even know if one’s worse than the
0:31:14 other, right? I actually like both.
0:31:14 Yeah.
0:31:17 But it absolutely will have an effect on the world.
0:31:17 Yeah.
0:31:23 And actually, I think that what did do though, is I think it is much harder to make a living as a
0:31:23 photographer.
0:31:24 Right.
0:31:29 You know, and I don’t know what the answer is here, you know, Andreessen, or what was it? Andreessen
0:31:32 or Horowitz, I can’t remember, one of the two of them used to give a talk where they were like,
0:31:38 innovation is inherently destructive often, right? Records got rid of in-home musicians.
0:31:38 Right.
0:31:45 In symphonics, washing machines displaced, house workers, stuff like that. But in the creative
0:31:49 side of the space, I’ve never seen humans be less creative just because they’re not paid for it.
0:31:52 We’re talking about how people are getting paid less to be a photographer and everything. We’re
0:31:53 not seeing less photographers.
0:31:53 Right.
0:31:56 This is what’s hard about subjects like this. We’re talking about something that’s both
0:31:58 business and art.
0:31:59 Exactly. Yeah.
0:32:04 But let’s kind of bring it back to like, kind of a couple things. Just Alembic and marketing and stuff.
0:32:05 Yeah. Yeah. Okay. Cool.
0:32:10 So say you have all the spiking neural networks, right? Say you have the causal math, you can build
0:32:15 the chain reactions. The key with what you have to do then, which I think kind of brings
0:32:18 it into this, is you have to make it understandable to a human.
0:32:18 Of course.
0:32:18 Right.
0:32:23 And you have to be able to take action on it. And we talk about this in my company a lot. Every
0:32:28 human has their own superpowers and specialties. I don’t expect everybody to be great at data and
0:32:32 math. I expect some people to be incredible at EQ and be able to keep the office together.
0:32:32 Right.
0:32:33 Yeah.
0:32:33 Stuff like that.
0:32:34 Shout out to those people.
0:32:39 Seriously. When we’re doing this, you have to have the system meet people where they’re at.
0:32:41 And that’s where that last mile comes in.
0:32:41 Yeah.
0:32:45 We take the data and instead of being like, I want to find a needle in a haystack, like
0:32:49 your security company being like, I need to find the one bad actor. We have to surface all
0:32:55 the data, have it make sense, and let it help them accomplish their goal. And what that means
0:33:00 we have to do is monstrous. We have to refine, with deep learning, this corpus of monstrous
0:33:04 amounts of disparate data that's pseudo-structured at best.
0:33:11 And be able to pull insights out of it. And with those insights, then be able to let people
0:33:16 meet the goals of their group. And I think that that in general with marketing is when you think
0:33:20 about it, you’re being like, if I can see everything, if I can see all the creative stuff I’m doing,
0:33:25 even the cool stuff, like I did a cool activation pop-up in New York and I can treat that the
0:33:26 same as I did a Google ad.
0:33:26 Right.
0:33:27 Right.
0:33:31 Right now one’s ignored. Those ones that will work, those cool pops that work, we’ll get credit
0:33:35 and we will get more cool things. Right. Whereas otherwise we’re just going to get
0:33:40 more just a whatever, right? Just some random, just simple thing. You like buy this thing.
0:33:46 When we talk about like creatives and business and everything, great information is always good
0:33:51 for buyer and seller. And so quality information is a plus. I've talked
0:33:55 to every CMO, chief marketing officer, on earth, right? If companies see that somebody
0:34:00 who’s creative is making me $2 for every dollar I spend, I’m going to spend as many dollars
0:34:05 with that human as possible. Whereas if you go, well, yeah, we did that cool pop up, but
0:34:09 I can’t see what that creator did. Right. Yeah. Why’d you pay the premium? Right. We’re going
0:34:14 to spend on vibes. Yeah. I think that like AI and everything, as we talk about nowadays,
0:34:20 can be wonderful and actionable and push ahead and provide real value for people where you could
0:34:25 be like, I know for every $12 I spend here, I should sponsor the NBA. I should sponsor the
0:34:29 WNBA because it makes me money. Right. Actually, I should be paying the WNBA more money because
0:34:33 then they can do more for me and then do stuff. Right. It creates quality across the data sets
0:34:37 because the things that get ignored are the outliers or the smaller things because they
0:34:42 can’t pick up enough signal. Right. We want to get all the signal. Yeah. Do it. And then
0:34:47 you can distribute correctly. Makes a lot of sense. So we’ve been ending the shows recently asking
0:34:52 the guests question, but I kind of want to ask you a variant of it. Question is what tools,
0:34:57 what AI tools are you using lately that, you know, you really like or might recommend. But I think
0:35:03 in this case, I mean, feel free, but also for somebody who’s out there who’s whatever, whatever
0:35:09 age of their life, but they’re newish to marketing or maybe they’ve been in marketing for a while,
0:35:16 but they’re new to getting a handle on AI and Gen AI and like how to actually start using it in a way
0:35:20 that, you know, can be constructive to the work they’re doing, their career development, that kind
0:35:25 of thing. Tools that you’d suggest, a book that you’d suggest, some resource you might suggest
0:35:32 to somebody out there who wants to, you know, get hands and brain on with AI and marketing right now.
0:35:37 All right. I’ll give two versions. I’ll give the, um, I’m just getting into it and I’ll give the,
0:35:39 I’m the giant advanced nerd. Fantastic.
0:35:46 So for beginners, what I always recommend for any of this stuff is actually, you literally can just
0:35:52 start with chat GPT. Sure. What I will say about the best of this is plan before you create. And
0:35:58 the number one thing that people forget to do is be like, tell the system, you may ask clarifying
0:36:04 questions. Say, "I want to plan," say anything you like, and be like, "please ask clarifying questions."
0:36:07 Yeah. And then it'll just start talking back to you, and you two will make the best plan on earth
0:36:12 and act. You can do this on how to learn. You can do this on how to do stuff, anything.
0:36:18 And I think people forget that, um, they want a declarative statement, but it can be a conversation.
0:36:20 Right. That’s good. That’s good advice. Yeah.
0:36:26 And, uh, it’s the number one thing I teach myself. The second thing is, is, and this sounds really
0:36:30 silly for beginners. If you’re doing a long conversation, like a big, long prompt, put the
0:36:36 instructions at the very top and at the very bottom. Yeah. Yeah. Absolutely. When things long
0:36:40 enough, you need to put in both places. There’s going to lost studies on this. Those two things
0:36:44 are probably the number two things that can get you 80% of the way there for improvement and just
0:36:50 explore. The second thing is find voices you like, right? Whether that’s Scott Bollinger, whether that’s
0:36:56 Gary V, whether that’s anything, and you can just use those voices to offset, right? You do want to talk
0:37:02 to actual humans for the super advanced folks. One thing I’ll recommend is choose your top 10 data
0:37:08 scientists, whether it's LeCun, whether it's whoever, and tweak your algorithm: go to LinkedIn
0:37:13 and like the last 10 of their posts that are only technical, go feed and train the thing to give you
0:37:18 everything you want. I have mine trained to only give me, you know, papers from arXiv and like
0:37:23 cool algorithms like that. And I actually find it very convenient. Yeah, I bet. And so I know people get
0:37:26 really frustrated with that type of stuff, but it’s the absolute best way to get the largest footprint
0:37:34 quickly. The second thing I will recommend is that please go look at other disciplines. People get
0:37:38 very myopic. We’re starting to see people do diffusion models right now for LLMs when we were
0:37:43 doing all transformer. I am never precious about where I get methodology from. I’m only precious about
0:37:49 what the result is. Yeah. And so I think that the breadth of learning, the curiosity, is the number one
0:37:55 thing. I feel like people, my team lately included, are very much: this is my LLM that I like to use, this
0:38:01 is my algorithm, this thing. And I'm like, it's all hammers and nails, guys. Right. Go build something.
0:38:07 Yep. And so, you know, Ben Franklin, when he did his famous work on electricity, wasn't trying
0:38:11 to discover electricity. He was trying to invent the lightning rod so that all the houses wouldn't burn
0:38:18 down. So, like, go explore. Those are the two things I would say. Excellent. Tomás Puig, Alembic. For
0:38:23 listeners who would like to learn more, website, where can they go online to learn more about Alembic,
0:38:28 the work you’re doing? Getalembic.com, right? Yeah. Nice little corporate-y site there. We are very
0:38:33 friendly. And then also, we are often at several of the industry conferences you'll be at: Gartner,
0:38:39 NVIDIA, Forrester, that type of stuff. Always feel free to say hi to us. Fantastic. Tomás,
0:38:43 thank you so much for your time. It’s been a, really, it’s been a pleasure talking to you. It’s
0:38:47 fun to kind of get into the theoretical. Yeah, yeah. If you’re ever up in a city, drop by the office or
0:38:51 wherever and say hello. I’d love to chat with you more. It was just an interesting conversation.
0:39:34 Thank you.
0:39:41 Thank you.
Tomás Puig, founder and CEO of Alembic, joins the NVIDIA AI Podcast to discuss the intersection of AI, data, and marketing. He shares how Alembic uses advanced mathematics and AI—particularly spiking neural networks and causal inference—to help brands extract actionable insights from massive, anonymized datasets. The discussion also touches on the evolving role of human creativity in an AI-driven world, and the importance of private data as a competitive advantage. Learn more at ai-podcast.nvidia.com.


