a16z Podcast
Vishal Misra returns to explain his latest research on how LLMs actually work under the hood. He walks through experiments showing that transformers update their predictions in a precise, mathematically predictable way as they process new information, explains why this still doesn’t mean they’re conscious, and describes what’s actually required for AGI: the ability to keep learning after training and the move from pattern matching to understanding cause and effect.
Resources:
Follow Vishal Misra on X: https://x.com/vishalmisra
Follow Martin Casado on X: https://x.com/martin_casado
Follow our host: https://twitter.com/eriktorenberg
Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

-
The Question of Education
Monopoly, oligopoly, cartel. All three of those words can describe the (not so) modern education system today, given the cost structures, economics, and accreditation capture — in everything from who can and can’t start a…
-
Pandemic Relief and Fraud: Willful Deceit or Design Defect?
This episode examines the potential for misuse and fraud among those applying for the Paycheck Protection Program (PPP)—and how fintech and software provide overlooked tools to stop it. On March 27th, the government enacted a…
-
Measuring & Managing Community Orgs, Developer Relations and Beyond
Okay, so we know community is important — whether for developer relations for your product or other types of communities — but how do we measure the success of community initiatives and even artifacts (like…
-
Reining in Complexity: Data Science & Future of AI/ML Businesses
There is no spoon. Or rather, “There is no such thing as ‘data’, there’s just frozen models”, argues Peter Wang, the co-founder and CEO of Anaconda — who also created the PyData conferences and grew…
-
Online Learning and the Ed Tech Debate
This episode is all about education and technology, a topic that’s especially top of mind this week as students in much of the country return to school—virtually. The intersection of learning and technology has been…
-
On Vaccines and Vaccinology, in COVID and Beyond
WHEN are we going to have a COVID-19 vaccine, and how the heck are we going from (what’s traditionally been up to) 12 years or so of vaccine development compressed into 12 months or…
-
Turning Open Source Developers Into Superfans
In this episode, we continue our community series with a recent discussion that applies to many kinds of community building. Today’s topic: How do you create a platform that people not only use, but tell…
-
Journal Club: Slaying the Sleeper Cells of Aging with CAR T
CAR T therapy is a groundbreaking medicine that uses engineered T cells to attack cancer. But CAR T cells (that is, chimeric antigen receptor T cells) can be programmed to recognize a huge range of…
-
Working, Making, Creating in Public… and Private
We’re living in an unprecedented era of online collaboration, coordination, and creation. All kinds of people are coming together — whether in an open source project or company, an R&D initiative, a department in a…
-
GPT-3: What’s Hype, What’s Real on the Latest in AI
In this episode — cross-posted from our 16 Minutes show feed — we cover all the buzz around GPT-3, the pre-trained machine learning model from OpenAI that’s optimized to do a variety of natural-language…
