Amperity Reimagines Data and Developer Workflows with AI

AI transcript
0:00:16 Hello, and welcome to the NVIDIA AI podcast. I’m your host, Noah Kravitz. From agentic
0:00:21 AI to vibe coding, our guest has been at the forefront of showing how AI-powered systems
0:00:26 can empower engineers while delivering measurable business value. Derek Slager is co-founder
0:00:32 and chief technology officer of Amperity, a company that’s redefining how enterprises use data
0:00:37 to better understand and serve their customers. Derek’s here to talk about his journey founding
0:00:42 Amperity, the tools his team is building, like Chuck Data, an AI agent for data engineers,
0:00:48 and his perspective on how AI is reshaping not just the enterprise landscape, but the developer
0:00:54 experience itself. So let’s get to it. Welcome, Derek, and thanks so much for joining the NVIDIA
0:00:56 AI podcast. Thanks, Noah. Great to be here.
0:01:01 Great to have you. So let’s start at the beginning. Tell us a little bit about Amperity.
0:01:07 Yeah, so Amperity is an AI-powered customer data cloud, and we help brands unify, understand,
0:01:13 and activate customer data at scale, which is a lot of things. And, you know, we’re really
0:01:20 particularly focused on data quality, right? Because we’re big believers that better data
0:01:24 equals better results, right? Right. You’re funneling that through agentic use cases or
0:01:31 otherwise. And so we have a lot of great capabilities to help people all over a consumer business take
0:01:37 advantage of that data, but it’s all about getting the data right. So what inspired you to co-found the
0:01:42 company? And how does your background, your own journey as an engineer kind of shape the direction
0:01:49 of Amperity? Yeah, for sure. So we started in 2016, right? Which, you know, I sometimes
0:01:54 refer to as like the false start AI era. You know, there was a lot of, you know, a lot of excitement
0:02:00 about kind of, you know, deep neural networks and a lot of other kind of innovation happening in the AI
0:02:05 space, but certainly it was, you know, nothing in comparison to the current AI wave, but nonetheless,
0:02:10 right? Like that was kind of, you know, it was on a lot of people’s minds and, and, you know, what we
0:02:18 observed in kind of researching Amperity was almost every consumer brand had a project to unify all
0:02:24 their customer data. And yet we couldn’t find a single one and we tried pretty hard. We couldn’t
0:02:29 find a single one that said, yeah, yeah, we solved it, right? Despite all that effort. And, and we just
0:02:35 found that ridiculous. And so, you know, what inspired us to start Amperity was really the
0:02:39 opportunity to help all these people who were trying and failing to solve a problem that was really
0:02:45 important to their business actually succeed in doing that. And, and of course, like the, the,
0:02:49 the key point there was, you know, all these people were smart. They were trying really hard to solve
0:02:54 it, right? So clearly the existing tools were insufficient, because if they were sufficient, people
0:02:59 would be achieving more success. And so, you know, the big idea of Amperity was really,
0:03:05 how do we bring AI to this problem? And we felt like, again, this is through a 2016 lens. We felt
0:03:09 like AI was sort of a perfect fit because a lot of the reasons that people were struggling with this
0:03:14 problem was that they were trying to make the data perfect, right? And, you know, AI kind of lives in
0:03:18 that space where, you know, maybe it doesn’t give you a perfect answer, but it gives you a really high
0:03:23 quality answer. And when we applied it to that problem, it worked really well. And so, you know,
0:03:28 we’ve kind of carried that forward, you know, into the, into the modern AI era. And, you know,
0:03:31 that’s allowed us to kind of, you know, bring acceleration to a whole bunch of other things,
0:03:33 which I’m sure we’ll talk a bunch about today.
0:03:38 Maybe before we dive in a little deeper, can you give us kind of a high level overview of some of
0:03:42 Amperity’s offerings, products and services, how you work with customers?
0:03:48 Yeah. So, so we have, you know, kind of a core capability around helping users
0:03:54 organize and unify data. So a big component of that and kind of our initial R&D innovation was
0:03:59 around stitching. And that’s about kind of finding all the Noah Kravitzes, you know, all over these
0:04:05 different, you know, data sources. A typical Amperity customer will have 25, maybe 30 different,
0:04:11 you know, data inputs that contain information about customers. And so, you know, we’re looking for all the,
0:04:16 all the Noah Kravitz needles across those various haystacks. So it’s, it’s a, you know,
0:04:21 it’s a challenging at scale problem. And so on the other side of that, then we can, we can sort of
0:04:27 utilize the outputs of Stitch to create a really high quality customer 360. And that allows people to
0:04:32 kind of, you know, ask questions and understand their data better. And so we have a bunch of tools,
0:04:37 you know, that allow people to, you know, kind of introspect and slice and dice. And, you know,
0:04:43 many of those are, you know, powered, you know, in, in 2025 by modern conversational interfaces,
0:04:46 which is really exciting because it empowers many different people from the organization.
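The stitching capability described above, finding every record that belongs to the same person across many sources, can be illustrated with a deliberately tiny sketch. This is not Amperity's Stitch, which does matching at scale; here records are grouped only by a normalized email, and all names and data are invented for illustration.

```python
# Toy illustration of identity "stitching": cluster customer records
# from multiple hypothetical sources under one identity. Real systems
# use far more sophisticated matching; this sketch keys only on email.
from collections import defaultdict

def normalize(email: str) -> str:
    """Crude normalization so trivially different emails still match."""
    return email.strip().lower()

def stitch(sources: dict) -> dict:
    """Group records from all sources by their normalized email."""
    clusters = defaultdict(list)
    for source_name, records in sources.items():
        for record in records:
            key = normalize(record["email"])
            clusters[key].append({**record, "source": source_name})
    return dict(clusters)

# Two of the "25, maybe 30" inputs a brand might have (made-up data).
crm = [{"name": "Noah Kravitz", "email": "Noah@example.com"}]
loyalty = [{"name": "N. Kravitz", "email": "noah@example.com "}]

unified = stitch({"crm": crm, "loyalty": loyalty})
print(len(unified["noah@example.com"]))  # → 2: both records in one cluster
```

On the other side of a step like this, the clustered records are the raw material a customer-360 profile would be built from.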
0:04:52 And then we have a lot of capabilities to take the data and take the kind of different segments of
0:04:57 that data and then get it out into the ecosystem, right? There’s thousands of marketing tools and
0:05:02 thousands of customer support tools and thousands of all these, you know, kind of tools that,
0:05:08 all are hungry for better data. So Amperity is really focused on getting that data right and then
0:05:13 activating that ecosystem. And we have a number of capabilities, you know, from kind of campaigns
0:05:20 with A-B testing to journeys and many other kinds of, you know, pieces of capability to
0:05:22 sort of allow that to integrate with the ecosystem.
0:05:27 Right. Fantastic. One of the things, I don’t know if you’d call it a tool per se,
0:05:33 but one of the things that’s getting a lot of attention currently in the world of AI is AI agents,
0:05:38 agentic systems. We’ve talked a lot about it on the podcast, talked about it from the NVIDIA
0:05:44 perspective of building the blocks that allow folks like you and Amperity to build the tools to serve
0:05:49 your customers. What does agentic AI look like from your point of view when you’re out working in real
0:05:54 life business situations and developing applications? What does agentic AI mean right now?
0:05:59 Yeah. Put you right on the spot. Yeah. How many definitions of agentic AI are there? And I’ll
0:06:05 admit sometimes, you know, I’m a little flexible with my definition. Sure. Sure. I can rephrase
0:06:10 and say, how do you approach it? What do you think about it? What’s the lens? Yeah, sure. So,
0:06:14 you know, I’ll start with my own definition, and I’ve heard many definitions that are
0:06:18 very complex. My own definition is very simple and, I think, pretty expansive as a
0:06:24 definition of agentic, which is: it’s just a program where the LLM can impact control flow. And
0:06:28 so, you know, impacting control flow might be retrying, it might be calling a tool, it might
0:06:32 be interacting with another agent. And so, you know, it’s pretty, it’s pretty open-ended in terms
0:06:36 of what could fit that definition of agentic. Uh, the reason I use that as the definition is because,
0:06:41 you know, when the, when the LLM is dictating control flow, that’s very different from a traditional
0:06:46 program, right? A traditional program that makes a call to an LLM is still a traditional program and
0:06:51 your eval metrics and other things look awfully familiar, you know, relative to, um, you know,
0:06:55 some other things that you would do, but once you kind of, you know, in essence, let the algorithm
0:06:59 take the wheel, uh, things change a lot. Right. And, and it changes kind of how you build systems,
0:07:04 how you monitor systems, how you evaluate systems. And more importantly, it changes what those systems
0:07:10 can do, uh, you know, for your business. And so, you know, obviously we’re working a lot with that
0:07:15 technology, but, but also I have the opportunity to talk to a lot of customers who are, you know,
0:07:19 working with that, you know, technology in their own organizations in various ways, uh, oftentimes
0:07:25 using, uh, you know, kind of Amperity data assets, but we see use cases. I think customer support is
0:07:29 one of the biggest use cases. And I think, you know, one of the interesting things is,
0:07:33 you know, a lot of people think, yeah, of course, like, you know, agentic for customer support,
0:07:37 that makes sense as a cost cutting exercise, but that’s terrible customer experience. I think what,
0:07:44 what I’ve found talking to customers about it is their customers like it. And so even when they’re
0:07:48 kind of fully eyes wide open, hey, I’m interacting with an agent, it actually yields a
0:07:52 good customer experience. So, you know, I think that’s one I’m seeing a lot, you know, analytics,
0:07:57 introspection, kind of understanding, uh, the business. There’s a lot of agentic use cases that
0:08:03 I see there. And many customers are experimenting with bringing agentic use cases to the end consumer.
0:08:07 And I don’t see quite as many of those, but ultimately I think that’s, you know,
0:08:11 what we’re going to start to see a lot more of, uh, as, as time goes on, I think, you know, I’m
0:08:17 seeing, I’m seeing a lot of those kind of up close and personal in prototype stage. And a lot of them
0:08:21 are a little bit stuck there at the moment as they kind of work through that kind of, you know, long tail
0:08:26 of challenges. But I think over the next year we’ll see a big increase in that becoming
0:08:29 a prominent component of customer experience for consumer brands.
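Derek's definition, "a program where the LLM can impact control flow," can be made concrete with a minimal sketch. The `call_llm` function below is a hypothetical stand-in for a real model call; the point is that the model's returned action, not hard-coded branching, decides whether the loop calls a tool or finishes.

```python
# Minimal agent loop: the (stubbed) LLM's decision dictates control flow.

def call_llm(state: dict) -> dict:
    # Stand-in for a real model call. A production agent would send the
    # conversation state to an LLM and parse its chosen action.
    if "weather" not in state["facts"]:
        return {"action": "call_tool", "tool": "get_weather"}
    return {"action": "finish", "answer": state["facts"]["weather"]}

def get_weather() -> str:
    return "sunny"  # hypothetical tool result

TOOLS = {"get_weather": get_weather}

def run_agent(question: str, max_steps: int = 5):
    state = {"question": question, "facts": {}}
    for _ in range(max_steps):
        decision = call_llm(state)  # the LLM impacts control flow here
        if decision["action"] == "call_tool":
            state["facts"]["weather"] = TOOLS[decision["tool"]]()
        elif decision["action"] == "finish":
            return decision["answer"]
    return None  # gave up; a real agent might retry or escalate

print(run_agent("What's the weather?"))  # → sunny
```

By contrast, a traditional program that merely makes one call to an LLM keeps all branching in ordinary code, which is why, as Derek notes, its evaluation metrics look familiar.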
0:08:36 Right. How do your clients, the users you work with, how do they feel about letting the LLM take the wheel?
0:08:42 Uh, I’m sure there’s a mix of things, but what’s the sense you’re getting, and how are
0:08:42 companies adapting?
0:08:49 It’s a really interesting question. I would have given you a boring answer six months ago, you know, but like,
0:08:55 it’s amazing how much it’s changed in six months. And I’ll start by saying like, uh, there’s huge
0:09:00 variance, right. And I don’t always know, right. It just kind of depends. Probably the
0:09:05 major variable there is, uh, you know, how much AI experimentation the leadership team has done,
0:09:09 you know, because I think, I think once they get it, it flows down really, really quick.
0:09:14 But, but yeah, I’ll have conversations with people who are like, yeah, we want, we want everything in
0:09:18 our organization to work this way yesterday. How fast can we get there? And I work with other people
0:09:23 who are like, yeah, we’re still kicking the tires. Right. And so, you know, it’s, it’s incredibly,
0:09:28 you know, diverse, the attitude. But boy, the trend line is clear, right? And
0:09:34 in the last six months, it’s amazing how many people have gone from kicking the tires to all in.
0:09:39 Yeah. Uh, and I think after another six months passes, I think we’re going to see probably,
0:09:45 you know, 80% of people are going to be in the all in mode. It feels like time has sped up over the
0:09:51 past few years. Yeah. Like extremely. Yeah. Yeah. Can you share a story of delivering an
0:09:57 agent-based experience or application for a large client, a Fortune 500 client, something like that?
0:10:03 Um, and kind of tell us about how it went, if there was a challenge or kind of a surprise to overcome and,
0:10:07 and, you know, how, how the customer, um, responded.
0:10:11 Yeah, absolutely. So, you know, it’s interesting. I’ll maybe give an Amperity-centric story
0:10:18 since, you know, those are the ones I’m closest to. We started, like a lot of companies, our
0:10:23 LLM-in-the-product journey with text-to-SQL, you know, and that was kind of, you know,
0:10:28 in a part of the product surface that was, you know, for people who already knew SQL and it was,
0:10:32 you know, kind of a nice augment of the experience. But then we challenged ourselves. We said,
0:10:37 Hey, what if we built something that was for people who don’t know SQL, right? People who
0:10:41 want to ask questions of their data and, and, and better understand and make decisions about their
0:10:46 data, but, but don’t have that skillset. Like what could we build for them? And so, you know,
0:10:50 we built a capability called AMP AI. And, to the point about a curveball,
0:10:55 I’m going to admit, I was a little skeptical because what I thought was we’re going to build
0:10:59 the surface. People would try it a little bit and then be like, well, I’m not sure if the data’s right.
0:11:03 So I’m just going to go back to what I did before, which is, you know, ask one of these SQL people to
0:11:08 give me the answer. And, and I suppose I could say I was pleasantly surprised because what we found is
0:11:14 when we, when we gave people AMP AI, there was a lot more energy for people to kind of really get it and
0:11:20 introspect the data than I had thought. Right. And it was surprising to me how much that kind of SQL
0:11:25 interface between the user’s question and the answer was a barrier. And so, you know, when we created the
0:11:31 experience that was for them, I was amazed at how many people used it and not just used it a couple
0:11:37 of times to try it, but, you know, we did a cohort analysis and we looked at usage
0:11:43 over time and, you know, people who use the product, you know, a little bit once they started kind of
0:11:48 using the AI interface went up to using it a lot and stayed there. And so it was really fascinating to
0:11:53 me to see, you know, a whole bunch of people without a very technical skillset, all of a sudden
0:11:58 become more data informed. Right. And all we had to do was put this, you know, little surface on top of
0:12:03 the data. It was, it was really, you know, kind of a, a great surprise and an exciting surprise. And
0:12:08 certainly has given us the confidence to, you know, lean more into these AI use cases.
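The text-to-SQL starting point Derek mentions follows a common pattern: translate a natural-language question into SQL against a known schema, then execute it. In this sketch, `question_to_sql` is a trivial stand-in for an LLM call, and the table and data are invented; it is not AMP AI's implementation.

```python
# Sketch of the text-to-SQL pattern: question -> SQL -> result.
import sqlite3

def question_to_sql(question: str) -> str:
    # Stand-in for an LLM prompted with the table schema. A real system
    # would also validate the generated SQL before running it.
    if "how many" in question.lower():
        return "SELECT COUNT(*) FROM customers"
    raise ValueError("toy translator only handles count questions")

# In-memory database with made-up customer rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "Ada", "Seattle"), (2, "Lin", "Portland")])

sql = question_to_sql("How many customers do we have?")
count = conn.execute(sql).fetchone()[0]
print(count)  # → 2
```

A conversational layer like the one Derek describes hides even this SQL step from the user, which is what lowered the barrier for the non-SQL audience.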
0:12:12 Yeah, no, that’s, that’s what you want to see, right? You get that adoption and it sticks.
0:12:20 Yeah. Along those lines of opening up more technical capabilities, possibilities for the less
0:12:26 technical people. Uh, let’s talk about vibe coding. Yeah. Maybe nine months, 12 months ago,
0:12:30 whenever it was, I saw a video and forgive me, I can’t remember the name of the company,
0:12:36 uh, where somebody, I think they were doing it on a mobile phone said to the phone, create an app that
0:12:42 does X, Y, Z. And we watched as on the screen, it spit out the code and ran the app. And it was,
0:12:48 oh my goodness, like what’s going to happen to the world. Now we fast forward. And the other day I
0:12:54 asked a coding tool like this to recreate one of the arcade games I grew up with. And, you know,
0:12:58 as you alluded to before, the results might not have been perfect, but it sure wrote a whole lot
0:13:04 of code that I couldn’t have written myself. This is amazing. It opens up possibilities. And this is
0:13:09 more from what I hear from people who are more versed in coding and security and things like that than
0:13:16 myself. It also opens up some potential issues. Talk to us about vibe coding. What does it mean to you?
0:13:22 What does it mean for, you know, engineers, people who are actually steeped in what they’re doing?
0:13:25 And how is this, you know, changing the landscape now and going forward?
0:13:30 Yeah. So, you know, vibe coding, I love that it has a name now.
0:13:35 Right. And I didn’t even explain what it meant. I just assumed at this point.
0:13:39 Oh, yeah. Yeah. I think if they’re listening to the NVIDIA AI podcast, they’re up on things.
0:13:40 Right.
0:13:46 Yeah. I think that makes sense. Yeah. I think vibe coding has, you know, impacted everybody. And you
0:13:51 gave the examples of kind of these, you know, increasingly impressive one shot examples where
0:13:54 you give a single prompt and something pretty amazing comes out the other side. Right.
0:13:59 That’s maybe one category of vibe coding. And then I think like vibe coding for the, you know,
0:14:04 professional software developer set looks very, very different. Right. And I think part of that
0:14:10 is, you know, people are working with very large code bases and, you know, just the properties of
0:14:14 the system and what you get out are very different than the big one shot thing where it’s generating lots
0:14:20 of code and things need to integrate with an existing workflow. I think what’s very interesting
0:14:26 is vibe coding changes the workflow of programming. And I think this is something that, you know,
0:14:33 I’ve really seen the people who in particular get tons of value out of vibe coding are not just doing
0:14:37 what they’ve always done, but doing a little bit faster because they’re involving an LLM. They’ve kind
0:14:43 of re-architected how they write code. And specifically, a lot of times they’re taking advantage of
0:14:51 asynchronicity. So, you know, some engineers will launch 10 different, you know, LLM based processes
0:14:56 at the same time. And then they may go off and do something that, frankly, if you were sitting there
0:15:00 watching them code, would look exactly like what they’ve always done. And then, you know, at the end
0:15:05 of that session, they’re going to go back in and check on the results of those agents. And, you know,
0:15:11 they might take three or four of those and just say, ugh, garbage, delete it. You know, or just start from
0:15:16 scratch, right? And then they might take a couple more of them, make a few refinements,
0:15:19 you know, check in on the code. And then there’s a couple others, maybe it just nailed it, right?
0:15:23 And those will kind of go straight through following a quick review. But if you think about it, like,
0:15:28 you know, they’ve essentially, you know, put kind of, you know, 10 additional engineers on their desk.
0:15:33 And the engineers have some interesting properties, right? Sometimes they are really impressive and
0:15:37 sometimes they’re astonishingly terrible, but they’re, you know, they’re effectively free,
0:15:42 you know, the cost of some, of some token generation. And so I think, you know, there’s very
0:15:46 different workflow that comes with vibe coding and, and, you know, obviously that has a, you know,
0:15:50 kind of profound impact on what a team can accomplish. I think the other interesting thing
0:15:56 about vibe coding is we’re increasingly starting to see people with different skillset profiles
0:16:03 contribute directly to the product, right? So, you know, rewind a year, if you’re a designer or you’re a product
0:16:07 manager, you’re probably not impacting production code except by influence, right?
0:16:12 But that’s changing, right? You can say, I think this should be this way and you can go and try it.
0:16:18 And, you know, we’ve built a lot of infrastructure here to enable these vibe coding use cases to be
0:16:22 accessible to a lot of people. So you can try something and, you know, it’ll put it in an
0:16:25 environment where you can test it. So you don’t have to spend, you know, two days setting up your
0:16:29 local machine. And then you can say, yeah, this is, this is, this is good. And then have an engineer
0:16:31 review the code and put it straight into production.
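The fan-out workflow Derek describes, launching many agent tasks in parallel and then triaging the results, can be sketched as follows. The agent task is stubbed with a made-up quality signal; in practice each slot would be a real LLM coding session, and the triage would be a human reviewing diffs and test results.

```python
# Sketch of the "launch 10 agents, then review" workflow.
import concurrent.futures

def run_agent_task(task_id: int) -> dict:
    # Stub: a real task would drive an LLM agent and report a signal
    # such as tests passing or lint output along with its patch.
    quality = "good" if task_id % 3 == 0 else "needs_work"
    return {"task": task_id, "quality": quality}

def triage(results):
    """Split results the way the engineer does at review time."""
    accepted = [r for r in results if r["quality"] == "good"]
    discarded = [r for r in results if r["quality"] != "good"]
    return accepted, discarded

# Launch ten tasks concurrently, then come back and review them all.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(run_agent_task, range(10)))

accepted, discarded = triage(results)
print(len(accepted), len(discarded))  # → 4 6
```

Accepted results still get the quick human review Derek mentions before merging; the discarded ones cost only token generation to throw away, which is the economic point of the fan-out.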
0:16:38 When it comes to things like quality assurance, security, other concerns around putting code into
0:16:45 production, is the answer, and I’m speaking from my own experience, kind of as a content creator and
0:16:51 using AI systems to help in a very kind of similar way to what you’re talking about, just my words are
0:16:57 content, not, you know, instructions to a machine. Is the answer as simple as having the right human in
0:17:03 the loop to review the output? And, like you said, knowing how to incorporate this into the
0:17:11 workflow, launch 10 processes, review them, you know, discard, use, adapt. Is it that simple in sort
0:17:16 of at a high level? And what are some of the things kind of maybe more nitty gritty that you think about
0:17:21 when it comes to vibe coding, quality assurance, security, you know, and working with these large
0:17:21 code bases?
0:17:26 Yeah. You know, it’s interesting. I think there’s a lot of discussion about this. The way I see it,
0:17:32 it doesn’t change much, right? Like, and what I mean by that is, you know, human beings are
0:17:38 typically the, you know, the weak link in a system, right? I think building systems that are robust
0:17:44 from an infrastructure perspective, from a scale perspective, from a security perspective requires
0:17:49 you to be resilient to error. And when designed intentionally, you expect error. And so, you know,
0:17:55 I do think that, you know, good system design looks the same, you know, when vibe coding is involved
0:18:00 versus not. And arguably it’s more important, right? Because, you know, the impact is larger.
0:18:06 So, yes, I think you still need accountability mechanisms, right? Like, at the end
0:18:10 of the day, you know, we’re very clear: engineers are responsible for what comes out.
0:18:16 And, you know, just like rewind 10 years, somebody might’ve copied and pasted, you know, some code
0:18:21 from Stack Overflow. They’re accountable, you know, for that, whether they use that as a shortcut
0:18:25 or not. This is exactly the same in vibe coding, at least in terms of kind of creating the surfaces
0:18:30 that we have. Now, there’s security concerns, obviously, that are new and different, you know,
0:18:35 and, you know, when we do, you know, penetration testing in 2025, you know, we’re looking at things
0:18:41 like prompt injection and other things, you know, in terms of the operational surface. But if we really
0:18:45 kind of zoom in on the, on the vibe coding era, it’s like, yeah, absolutely. The humans are still
0:18:52 accountable for sure. Right. I’m speaking with Derek Slager. Derek is co-founder and CTO at Amperity,
0:18:59 a company that is doing, I think it’s safe to say, a variety of things to help their customers, other
0:19:05 companies, users understand and make the most of their data. We’ve been talking about engineering,
0:19:11 the developer experience. I want to ask you about one of Amperity’s tools, Chuck Data. Tell us about
0:19:16 Chuck Data. Yeah. So Chuck Data is something that we launched and, you know, it’s such an interesting
0:19:22 extension of our vibe coding conversation because, you know, we were, you know, experimenting and
0:19:28 understanding and learning the best ways to, you know, extract value out of the various kind of
0:19:33 interaction patterns around vibe coding. And then we asked ourselves what seemed like sort of an obvious
0:19:38 question, like, why is this only for software engineers? Like, you know, and, and, you know,
0:19:42 we looked at some of the things that, you know, our customers or even some of our own teammates were
0:19:46 doing related to customer data engineering, right? Where, you know, they’re pushing around, you know,
0:19:52 big, big mountains of SQL or, you know, trying to kind of manually build, you know, workflow
0:19:59 coordination patterns or, you know, maintaining, you know, huge kind of dbt code bases. And we thought,
0:20:04 whoa, there’s an opportunity here. Sure. And so, and so the idea behind Chuck Data is, hey, what if we
0:20:11 created a vibe coding tool for data engineers that is specifically focused on customer data use cases?
0:20:17 Okay. And so, you know, Chuck was kind of packaged up to be a very, very easy way for people to, you know,
0:20:22 in essence have these kind of like, you know, one shot experiences that you described earlier in sort
0:20:25 of the classic software engineering sense, but for customer data engineering use cases.
0:20:31 Are there competitors? Or, when you were building Chuck Data, were there reference points? And
0:20:36 I’m less asking about what those were, but what’s different about Chuck Data? What sets it apart?
0:20:41 Yeah. It’s such an interesting question because the thought I had, you know, as we were kind of
0:20:46 forming Chuck, and no doubt we took a lot of inspiration from Claude Code, you know, that’s
0:20:51 definitely kind of, you know, number one in the Amperity rankings of vibe coding tools.
0:20:56 And so we took a little bit of inspiration from that, though Claude Code, of course,
0:21:02 is aimed at more, you know, traditional software engineering. But I expected, you know, there was a
0:21:06 point in time where I was like, okay, let me go find what other people are doing. I’m sure other people
0:21:10 have thought of this and I couldn’t find them, you know, and perhaps they, you know, they, they did
0:21:15 exist and hadn’t made their way up the, uh, you know, SEO and GEO rankings, but, but I literally
0:21:19 couldn’t find it. Yeah. Right. And so, which is always kind of an exciting and scary
0:21:25 feeling. Yeah. Yeah. Yeah. I was like, hooray. And then I was like, Oh, what am I missing? Yeah.
0:21:30 Right. Are we too early? Yeah. Right. And so, yeah, no, we leaned in. And I think,
0:21:35 you know, at Amperity, we always really think about things starting from first principles. Right. And it’s like,
0:21:39 hey, we can help people really accelerate some things that we know they’re, uh, they’re going
0:21:43 to be challenged with. And, and, and of course that’s valuable. So let’s, uh, let’s lean into
0:21:47 that, and, uh, yeah, I’m sure the competitors will follow eventually. But yeah, I was a little
0:21:51 bit surprised to have trouble finding them. Where does the name come from? Uh, the name, uh, it’s
0:21:57 inspired by, you know, one of our very early engineers who, you know, kind of did a lot of
0:22:02 work on, uh, some of our, you know, initial R&D. And so we kind of wanted to, you know,
0:22:07 have a little nostalgic throwback. Yeah. Awesome. Yeah. Very cool. So talking about
0:22:14 vibe coding and AI powered workflows and the experience of a developer and, you know, a new
0:22:18 developer kind of learning the trade and getting their feet wet and experienced developer, as
0:22:23 you said, kind of rethinking their workflows and how they do things. There’s concern in some
0:22:28 circles that, you know, again, I’ll, I’ll relate it back to my own experience, similar to
0:22:34 how there’s concern that students, adults, professionals for that matter, are using AI
0:22:41 to do their writing for them. Concern around vibe coding, AI powered tools, kind of, um, enabling
0:22:46 programmers, if you will, to skip the process of really learning to understand how the code
0:22:51 works, how to, how to structure applications, how to do all those things. What’s your take on
0:22:57 that? Are there, are there risks of over-reliance? Is it just kind of where the wind’s blowing
0:23:02 and we’ll adapt? You know, what’s your take and what’s Amperity doing, uh, to make sure
0:23:06 that, you know, programming doesn’t turn into just hitting tab repeatedly and then hitting
0:23:13 ship? Yeah. The concerns sound familiar to me. You know, I’ve been around a little while
0:23:18 and, you know, my career, uh, uh, started in the late nineties and, you know, you know, a
0:23:21 lot of the conversation in the late nineties was like, ah, you know, these kids today with
0:23:25 their high level languages, right. They don’t even, you know, they don’t even know how to
0:23:29 do pointer math and then, uh, you know, and then it sort of moved on to all these kids
0:23:35 with their IDEs, with their fancy autocomplete, kids with their, you know, garbage-collected
0:23:39 languages. Back in my day, you know, we used to manage memory ourselves, et cetera, et cetera,
0:23:43 et cetera. And so, you know, it sounds a little bit like the, uh, the, the grumbly old person
0:23:48 rant, you know, totally a little bit. Sure. Like vibe coding has a bunch of properties that
0:23:52 allow you to create a bunch of code quickly. And I think, you know, there’s many prominent
0:23:55 examples that aren’t hard to find of people who didn’t understand what was happening there
0:24:01 and then really bad things happened. Right. And so in some sense, back to maybe the accountability
0:24:06 point from before, nothing really changes, right. You’ve got some tools. It allows you to, to,
0:24:11 to work differently, to work faster, but like you’re ultimately accountable and it’s not optional,
0:24:16 you know, to understand how that code works. It really isn’t. And I think even, you know, if
0:24:21 we look over the horizon and imagine a step change in the model and, you know, kind of
0:24:25 more, you know, agentic sort of verification and validation workflows, it’ll get easier,
0:24:29 it’ll get faster. Uh, but at the end of the day, you know, I think, I think society is built
0:24:34 around the notion that, you know, humans are going to have accountability for, uh, for the
0:24:39 things they do. Right. And, and, you know, uh, I don’t, I don’t really think that changes.
0:24:43 And so I think it’s a great new set of tools. I celebrate the great new set of tools. It allows
0:24:48 us to, you know, build more faster for our customers. And I think that’s amazing and awesome.
0:24:54 So along those lines, then what will the engineers, what will the coders of tomorrow need? How,
0:24:59 how does the skillset change? How does the mindset, the approach to, you know, constructing something
0:25:04 new, working from existing code base, all of those things, how does that change for the folks coming
0:25:11 up now? I think it’s going to be really different. I think we’re designing a new way to build software.
0:25:16 And, and, and I really mean that, you know, the workflow is different. And I think that the skill
0:25:22 sets that matter are also different. I think maybe the best engineers of 2015 won’t be the best
0:25:29 engineers of 2026, if that makes sense. It does, but why? Because I think, you know, once upon a time,
0:25:34 right, that developer who could master that algorithm, you know, or had this kind of
0:25:38 like, you know, deep arcane knowledge of how a particular subsystem worked was, you know,
0:25:45 especially valuable. But that skillset doesn’t sort of automatically adapt to, you know, how do I go
0:25:50 and, and kind of build a whole network of different processes that are happening, right? It’s almost
0:25:56 like, you know, you’re going from being, you know, an expert artisan to a general contractor. That’s a
0:26:01 different job. And they’re both important jobs, but it’s a very different job. And being a, you know,
0:26:06 great general contractor in a large complex, you know, problem space where you have lots and lots
0:26:10 of subcontractors and you need to kind of orchestrate that all together is just different than, you know,
0:26:15 kind of the core craft. And so, and look, we’re still learning as an industry, what that skillset
0:26:19 ultimately looks like, but I think it’s going to create a lot of opportunities for people,
0:26:23 you know, who maybe wrote themselves out of the, you know, engineering game once upon a time.
0:26:29 And I think the skill sets are going to look different. And, you know, I’m seeing
0:26:33 some early evidence of that. And I expect that’ll continue to evolve
0:26:35 at Amperity and in the industry as a whole.
0:26:41 So how does Amperity approach keeping engineers at the center of the process
0:26:47 as these tools change? As you said, you know, there’s the human accountability, and then,
0:26:51 it’s not the other side really, but maybe at the beginning of that process,
0:26:56 the spark of innovation, the idea of humans innovating on our own, working together, using
0:27:03 tools. Yeah. Is there a philosophy at Amperity around keeping the engineers
0:27:09 at the forefront and not having them sidestepped by increasingly advanced automation?
0:27:14 Yeah, I think it’s a fair question. And admittedly, today that’s pretty easy,
0:27:19 because there are only very small components of the engineering
0:27:23 life cycle and the product design life cycle, so to your point, the ideation and things like that,
0:27:28 that could be, at least in theory, automated away. And so today, obviously,
0:27:33 humans are still firmly planted in the feedback loop, and they’re driving
0:27:39 that process entirely. You know, I think I can close my eyes, look over the horizon, and
0:27:45 imagine some pieces of that evolving, but I think it still largely rounds to the same.
0:27:49 And, you know, if I look at the backlog of ideas
0:27:54 that people have for how to make Amperity better, there are thousands of good
0:28:00 ideas for how to make Amperity better, right. And so in some sense, the challenge
0:28:05 is always curation, and better tools to help with that are wonderful. And I think
0:28:10 we’re all still going to value that kind of human touch. And I think we’ll
0:28:16 expect and appreciate that that human touch is going to be enhanced, augmented, and made better
0:28:22 by the inclusion of AI tools. But I think we’re a long way out from having
0:28:28 humans out of the loop, particularly in software engineering and customer data engineering use cases,
0:28:33 because, you know, the difference between right and accountable, I think, is just fundamental
0:28:40 to the model. Well said. So shifting from the future and kind of coming back for a moment,
0:28:44 if you look across the work Amperity is doing, the work Amperity has done with clients,
0:28:51 specifically talking about agents and agentic systems, any big wins, any really exciting little
0:28:56 stories you can share, or learning moments, if not a win, something that, you know, really stuck
0:28:59 out that’s informing how you look ahead?
0:29:05 Yeah. I think the biggest wins really fit the category of enabling people to do things
0:29:09 they didn’t think they could do. And that might be, from the examples
0:29:14 earlier, people who can analyze data they weren’t able to before, or people who
0:29:18 kind of always told themselves that we can’t do really sophisticated segmentation because we’re
0:29:24 bottlenecked on creative resources. And so being able to empower people
0:29:27 to do things that they couldn’t do before, that ultimately are good for their customers and that
0:29:31 they’re motivated to do. I mean, those are the things that are the big wins for me and get me
0:29:37 excited. I think in terms of learnings, business context matters a ton. And I think that’s one of
0:29:41 the really important ones. It’s sort of an obvious thing, but it’s really important to
0:29:46 design for that. You know, I was recently talking to one of our
0:29:51 customers that sells cars. And if you’re a customer that sells cars
0:29:57 and you refer to a product as a taco, you’re probably talking about a Toyota Tacoma,
0:30:02 right? Because that’s their shorthand for Toyota Tacoma. And if you’re in
0:30:06 a quick-serve restaurant and you say taco, it means something completely different, right? To choose a
0:30:11 silly example. But there are so many of those things, and we see that when
0:30:16 we watch people interact with the system. There’s a language to every company, and the way
0:30:20 they think about things, particularly when they’re asking questions about customer data, is really
0:30:25 infused with that language. And so a big learning is, the faster we can
0:30:33 bootstrap the LLMs with customer-specific knowledge, it’s a huge step change in
0:30:39 the efficacy and the empowerment, which fuels more of those wins. Yeah. Yeah. So you’ve
0:30:43 kind of touched on this, but I’m going to ask you, to use your word, to curate some of
0:30:47 the things we’ve talked about, and some of the other things I’m sure you’re seeing day to
0:30:53 day. What gets you excited about what’s coming next, specifically in the enterprise? And what
0:30:58 are some of the things, when you’re working with, say, new clients, or just talking to folks who are
0:31:03 on the forefront, new adopters, that not only excite you about what’s
0:31:09 coming, but that you think are really key for folks to keep in mind as they start exploring,
0:31:14 as we said before, everything’s moving so fast, these increasingly advanced tools?
0:31:20 Yeah. I think what’s exciting to me and what we’re really seeing is I think people started out by
0:31:25 saying, how can AI make the thing I’m already doing go faster? Right. And that’s a great place to start.
0:31:29 But I think what’s exciting is people are starting to rethink their businesses,
0:31:34 right? Vibe coding causes you to rethink your developer workflow. But what if
0:31:39 you’re planning a marketing campaign? Well, today in 2025, that probably still looks pretty
0:31:43 similar. But when we look ahead, and particularly at some of the things that
0:31:49 we’re working on with our customers, we can go from making tasks go faster to really helping
0:31:54 to re-inform the strategy and how that comes to life. And I think, you know,
0:31:58 obviously, you know, anybody can go to their favorite LLM and ask it some questions about marketing
0:32:05 strategy and that’s fine. But the thing that’s so exciting is integrating that experience into a system
0:32:11 and a platform that has the data, that has the business context, that knows what’s happening,
0:32:15 can close the loop. And you start to kind of like close your eyes and think about that.
0:32:21 Wow. Like that’s really exciting at that point. You know, it’s not just about making that task go
0:32:25 faster. It’s about making your business go faster. And I don’t think that’s hyperbole at all, right?
0:32:32 That’s right around the corner for us. And so I think we’re extremely excited about some of the
0:32:37 opportunities ahead, and extremely excited that our customers are really pushing us
0:32:40 to be able to do that as quickly as possible.
0:32:47 So along those lines, best practices, words of advice for listeners out there, for teams who
0:32:53 want to harness agentic AI in particular, but want to be mindful of the things we’ve talked about,
0:32:56 innovating, being accountable, being responsible.
0:33:03 Yeah. I would say the one piece of advice, and I give this advice a lot, is start now. And-
0:33:04 We’ve heard that one before.
0:33:10 Yeah. It’s so important. It’s so important because it’s early, right? We’re still figuring out
0:33:16 the patterns and the practices. As an industry, we’re learning a lot about
0:33:22 how to put these incredible new technologies together in ways that really
0:33:27 move the needle. And right now you just have a choice, right? You can be
0:33:31 a doer who’s in that learning loop, or you can be an observer and kind of
0:33:35 wait and see. And, you know, we talk a lot about this here:
0:33:40 speed’s the only thing that matters. And so I don’t think it’s viable in the current
0:33:44 market to be outside that learning loop. Right. And the good news is it’s early, right? And so
0:33:50 you’re not too late, but it’s getting to the point where pretty soon you will be too late. And so I
0:33:54 think we’re certainly past the point, and again, this is something that’s
0:33:58 changed in the last six months, we’re past the point where people are like, well, we’ll see if
0:34:04 this AI thing plays out or not. It’s just overwhelmingly obvious where things are
0:34:10 going. And so, yeah, get off the sidelines, get in there, try stuff, learn. It’s easier than ever
0:34:15 to do that. There’s more information out there. And of course, AI feeds itself,
0:34:19 right? AI can also help people figure out where to start and how to get through. So, yeah,
0:34:25 start now and go really fast. That’s the path to success. Fantastic. So to put a point on that
0:34:30 for folks listening who are like, yeah, I’m ready to go, I want to start by learning more about
0:34:35 Amperity. Best places for them to go? Website, social media, where should they start?
0:34:40 Yeah. Go to Amperity.com. I hope many of those people are thinking to themselves,
0:34:44 wow, I’d love to work at Amperity. So, you know, hit that careers page. Awesome. You know,
0:34:50 we’re in growth mode and hiring, certainly, especially people who have
0:34:54 passion about how to bring AI to customers in ways that really move the needle on their
0:34:59 business, across a variety of roles. So yeah, we’d love to see people visit that page. And, yeah,
0:35:04 certainly we’re fairly open with information on our website about what we do and how
0:35:08 we do it. And we’ve got a lot of work we’ve done over the years with
0:35:12 published papers and patents and other things. So we love engaging with people who are
0:35:17 just interested in the space. And, to use a phrase that gets used a lot around Amperity,
0:35:21 we love nerding out with people. Awesome. Derek Slager, this has been a great conversation
0:35:27 and I think just a lot of wisdom and a lot of really practical advice that really
0:35:33 resonates, on how quickly things are moving, how important it is to get started, and,
0:35:38 for all that we can do now, the possibilities, even in the very near term. Or just, as you said,
0:35:42 close your eyes and imagine, and it’s really something. Yeah, for sure. Well, I appreciate
0:35:46 you having me, Noah. It’s been a fun conversation. Appreciate having you on. Let’s do it again down
0:35:47 the road. Sounds great.

Derek Slager, co-founder and CTO of Amperity, explores how agentic AI and vibe coding are reshaping enterprise data management and the developer experience on the NVIDIA AI Podcast. Hear how Amperity’s platform unifies customer data, powers advanced analytics, and brings conversational interfaces to every part of the organization—helping brands activate, segment, and leverage insights at scale. Discover why data quality matters more than ever, how agentic AI transforms workflows, and why human accountability stays central in the age of automation.

Learn more at ai-podcast.nvidia.com.