Summary & Insights
If you think of the internet as a global brain, APIs are the dendrites that connect its neurons, allowing different systems to “think” and act together. This foundational idea frames a conversation with Alex Rattray, founder of Stainless, about how the very nature of these connections is undergoing a seismic shift. The discussion centers on the evolution from APIs designed solely for human developers to interfaces that must also serve a new class of user: AI agents. Drawing from his experience building Stripe’s revered developer platform, Alex explains why a high-quality Software Development Kit (SDK) is often the true API for developers, providing the type safety, idiomatic design, and robustness that make integration possible. Yet, the rise of AI, particularly through protocols like the Model Context Protocol (MCP), introduces novel challenges and forces a reimagining of what an intuitive interface looks like when the user is a large language model.
The core of the challenge lies in the fact that AI agents interact with APIs fundamentally differently than humans. While a human developer can navigate extensive documentation, an LLM can be overwhelmed by the sheer volume of potential endpoints and parameters, quickly exhausting its limited context window. This requires a new paradigm for API design—one that dynamically reveals functionality and filters responses to keep interactions lean and focused. The conversation explores practical solutions, such as allowing users to select only the API resources they need, creating dynamic tool discovery flows, and employing jq filters (a lightweight JSON query language) to let the AI request only the specific data fields relevant to its task, thereby preserving precious context.
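The "dynamic discovery" idea described above can be sketched in a few lines. This is a hypothetical illustration, not Stainless's actual implementation: the registry, endpoint names, and three-step flow (list, describe, invoke) are assumptions made to show how an agent can pull in detail only when it needs it, rather than receiving the whole API spec up front.

```python
# A minimal sketch of "just-in-time" endpoint discovery for an LLM agent.
# ToolRegistry and the example endpoint are hypothetical names.

class ToolRegistry:
    def __init__(self, spec):
        # spec: {endpoint_name: {"summary": ..., "params": ..., "fn": ...}}
        self._spec = spec

    def list_endpoints(self):
        # Step 1: cheap overview -- names and one-line summaries only.
        return {name: meta["summary"] for name, meta in self._spec.items()}

    def describe(self, name):
        # Step 2: full parameter schema, fetched only when the agent asks.
        meta = self._spec[name]
        return {"summary": meta["summary"], "params": meta["params"]}

    def invoke(self, name, **kwargs):
        # Step 3: execute the chosen endpoint with validated arguments.
        return self._spec[name]["fn"](**kwargs)


registry = ToolRegistry({
    "customers.list": {
        "summary": "List customers",
        "params": {"limit": "int"},
        "fn": lambda limit=10: [{"id": f"cus_{i}"} for i in range(limit)],
    },
})

print(registry.list_endpoints())  # tiny payload: fits easily in context
print(registry.invoke("customers.list", limit=2))
```

The point of splitting the flow into three calls is that only step 1 consumes context unconditionally; the full parameter schemas in step 2 are paid for only per endpoint actually used.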
Ultimately, this shift doesn’t diminish the importance of traditional developer experience; it amplifies it. As AI coding agents become more common, they exhibit a strong preference for using well-typed, polished SDKs over making raw HTTP requests, as the SDKs provide guardrails and reduce hallucination. This creates a virtuous cycle where the need for AI integration accelerates demand for the very best developer tools. The future, as Alex envisions it, involves developers focusing on high-level business logic and API design, while robust platforms and frameworks—like the one Stainless is building—handle the intricate, low-level infrastructure of SDK generation, documentation, and versioning, catering seamlessly to both human and machine users.
Surprising Insights
- The SDK is the API: For many developers interacting with a service daily, the SDK is their primary interface, not the underlying REST API. Its quality—being idiomatic, robust, and well-documented—directly defines the developer experience.
- AI agents prefer SDKs: Contrary to the assumption that LLMs would work best with simple HTTP endpoints, coding agents actively seek out and prefer to use typed SDKs because they provide structure, reduce errors, and allow for faster iteration within a local environment instead of making risky “testing in production” API calls.
- Context window is a major design constraint: The most significant hurdle in designing APIs for LLMs isn’t functionality, but the limited context window. Presenting a full API spec with all parameters is often impossible, forcing new strategies for dynamic, just-in-time tool and parameter discovery.
- AI unlocks new tooling for efficiency: Simple additions, like equipping an AI with a jq-style JSON filtering capability, allow it to request only the specific data it needs from a large API response, dramatically improving efficiency and keeping context usage light.
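The jq-filtering insight above can be illustrated with a toy example. Real implementations would shell out to the jq binary or use a jq binding; this pure-Python `jq_lite` helper is an assumption for illustration and handles only simple dotted paths like `.items[0].id`, which is enough to show why the technique saves context.

```python
# Toy illustration of response filtering: instead of returning the full
# payload, let the agent pass a jq-style path and get back only that slice.
import re

def jq_lite(data, path):
    """Resolve a minimal jq-style path (e.g. ".a.b[2].c") against nested data."""
    # Tokenize into field accesses (.name) and list indices ([n]).
    for key, idx in re.findall(r"\.([A-Za-z_]\w*)|\[(\d+)\]", path):
        data = data[key] if key else data[int(idx)]
    return data

response = {
    "customers": [
        {"id": "cus_1", "email": "a@example.com", "metadata": {"plan": "pro"}},
        {"id": "cus_2", "email": "b@example.com", "metadata": {"plan": "free"}},
    ]
}

# The agent asks only for the one field it needs, keeping context usage light:
print(jq_lite(response, ".customers[1].metadata.plan"))  # free
```

A full response might be kilobytes of JSON; the filtered answer here is a single token-sized string, which is exactly the trade the episode describes.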
Practical Takeaways
- Treat your SDK as a first-class product, investing in its design, type safety, and documentation, as it is the main touchpoint for both human developers and AI coding agents.
- When building for AI agents via protocols like MCP, design for dynamic discovery. Instead of dumping your entire API spec into the context, create systems that allow the agent to list, describe, and then execute endpoints in separate steps to conserve tokens.
- Implement filtering mechanisms like jq in your API responses to enable AI agents to request only the specific data fields they need, preventing large payloads from breaking the interaction.
- Prioritize making your API’s functionality selectively exposable, allowing users (or the AI itself) to enable only the specific resources (e.g., only customers, only read operations) they need for a given task to reduce initial complexity.
- Ensure your documentation is machine-readable and version-accurate so that AI coding agents can access the correct reference material, reducing hallucinations and integration errors based on outdated SDK versions.
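The takeaway about selectively exposing API surface can be sketched as a simple filter over a tool catalog. The tool names and the `select_tools` helper below are hypothetical, invented to show the shape of the idea: the user (or the agent) opts into specific resources and verbs, and everything else stays out of the context window.

```python
# Sketch of selectively exposing API surface to an agent: from a full tool
# catalog, keep only the resources and verbs the user opted into.
ALL_TOOLS = {
    "customers.list": "read",
    "customers.create": "write",
    "invoices.list": "read",
    "invoices.void": "write",
}

def select_tools(tools, resources=None, read_only=False):
    """Return the subset of tools matching the chosen resources and verbs."""
    selected = {}
    for name, verb in tools.items():
        resource = name.split(".")[0]
        if resources is not None and resource not in resources:
            continue  # user did not enable this resource
        if read_only and verb != "read":
            continue  # writes excluded for this task
        selected[name] = verb
    return selected

# Expose only read operations on customers:
print(select_tools(ALL_TOOLS, resources={"customers"}, read_only=True))
# {'customers.list': 'read'}
```

Starting from the smallest useful surface and widening on demand keeps the agent's initial prompt lean and also limits the blast radius of a mistaken call.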
