a16z
July 21, 2025

The Future of Software Development - Vibe Coding, Prompt Engineering & AI Assistants

The a16z infra team breaks down how AI is not just a new feature but a fourth pillar of infrastructure, built on top of compute, storage, and networking. What makes this pillar different is that, for the first time, developers are outsourcing logic itself, a shift that is causing software to disrupt itself.

AI: The Fourth Pillar That's Eating Software

  • "I don't remember ever in the history of computer science where we've, from an application standpoint, abdicated logic... the logic, the yes or no, the what it's doing, always came from the programmer. But in these ones, we're like, 'come up with the answer for me.'"
  • "One of the most exciting things about the AI wave is that software is being disrupted. Like, we're being disrupted."
  • For the first time, AI models are being treated as a new infrastructure primitive that handles reasoning and logic, not just resources. This forces a complete reimagining of the software stack and the role of the developer.
  • The industry is moving from a world where programmers defined every explicit instruction to one where they guide and orchestrate intelligent systems, effectively becoming the managers of AI logic. This shift has led to the "consumerization" of developer tools, where adoption patterns look more like B2C than traditional enterprise sales.

Supercycles, Moats, and Non-Zero-Sum Games

  • "I would encourage anybody that does invest at least in infrastructure to not think zero-sum and to realize that historically every layer of the stack has maintained some level of value and margin."
  • The current AI boom mirrors past supercycles like the internet and cloud, defined by a massive expansion of the Total Addressable Market (TAM) and the creation of entirely new user behaviors that incumbents are unprepared for.
  • Early theories of "no defensibility" in AI have been debunked. Value is being captured at every layer—from chips (Nvidia) to models (OpenAI) to apps (Midjourney). The market is in an expansion phase where new value is being created everywhere, not a consolidation phase where players fight over a fixed pie.

The Augmented Programmer

  • "It's really not about prompt engineering; it's context engineering... You have to know what to put in the context in that prompt."
  • "The reason people buy software is because somebody else made the decisions of what the workflow should be... That turns out to be a much harder problem than actually building it."
  • The key skill for developers is shifting from "prompt engineering" to "context engineering." This involves architecting systems that can feed models the correct data and tools to generate reliable, high-quality outputs.
  • AI assistants and agents are already proving effective in constrained domains like coding, where a "loop" of generation and error-correction (e.g., compiling, linting, testing) is possible; a minimal sketch of that loop follows this list. This boosts productivity and allows developers to operate at a higher level of abstraction.
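
To make that loop concrete, here is a minimal sketch in Python. The `generate_code` stub is a hypothetical stand-in for a real model API call (not any specific vendor's SDK), and the deterministic check uses the standard-library `py_compile` module; a real coding agent would also run linters and test suites.

```python
# Minimal sketch of a coding agent's generate/check loop. generate_code() is
# a hypothetical stand-in for a real model call, stubbed here so the loop
# can be exercised end to end; the deterministic check is Python's own
# compiler, where a real agent would also run linters and unit tests.
import os
import py_compile
import tempfile


def generate_code(task: str, feedback: str | None = None) -> str:
    # Hypothetical model call. A real implementation would prompt an LLM
    # with the task plus any compiler feedback from the previous attempt.
    if feedback is None:
        return "def add(a, b) return a + b"  # first draft: syntax error
    return "def add(a, b):\n    return a + b"  # revised after feedback


def check(code: str) -> str | None:
    # Deterministic verifier: returns an error message, or None if the
    # candidate at least compiles.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        py_compile.compile(path, doraise=True)
        return None
    except py_compile.PyCompileError as err:
        return str(err)
    finally:
        os.unlink(path)


def code_agent(task: str, max_attempts: int = 3) -> str:
    feedback = None
    for _ in range(max_attempts):
        candidate = generate_code(task, feedback)
        feedback = check(candidate)
        if feedback is None:
            return candidate  # passed the error-correction loop
    raise RuntimeError(f"no passing candidate after {max_attempts} attempts")


if __name__ == "__main__":
    print(code_agent("write an add(a, b) function"))
```

The structural point: any domain with a cheap, deterministic verifier gives the model a corrective feedback signal, which is why coding agents are ahead of open-ended ones.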

Key Takeaways:

  • The Developer's Job Is Changing. The core job of a developer is shifting from writing lines of code to defining user needs and architecting intelligent systems. This is a monumental change, but one that creates more opportunity, not less.
  • AI Isn't a Feature; It's a New Infrastructure Primitive. For the first time, developers are outsourcing logic, not just resources. This fundamentally changes how software is built, valued, and sold.
  • Abandon Zero-Sum Thinking. The AI market is in a massive expansion phase, not a consolidation battle. Value is accruing at every layer of the stack simultaneously; assuming one layer's gain is another's loss is a flawed thesis.
  • The Future is More Developers, Not Fewer. AI tools augment productivity and lower the barrier to entry. This elevates the developer's role to focus on product design and workflow definition—the real hard problem in software.

For further insights and detailed discussions, watch the full podcast: Link

This episode reveals that AI is not just another tool but a fundamental new infrastructure layer that is actively disrupting the very profession of software development, forcing a complete re-evaluation of how we build, program, and invest in technology.

Defining the New Frontier: What is Infrastructure in the AI Era?

  • Core Pillars: Traditionally, infrastructure consists of three pillars: compute, networking, and storage. It also includes the entire toolchain that supports the software development lifecycle.
  • AI as the Fourth Pillar: Jennifer introduces AI models as a distinct fourth pillar of infrastructure. Models are not just applications but foundational components that leverage and stress the other three pillars, demanding massive compute, new data architectures, and low-latency networking.
  • A Fundamental Shift in Programming: Martin argues that a new infrastructure layer is defined by its ability to change how we program computers. AI models do this by introducing a radical new concept: the abdication of logic. For the first time, programmers are not just requesting resources but are asking the system to generate the logic itself.
    "I don't remember ever in the history of computer science where we've from an application standpoint we've abdicated logic... the what it's doing always came from the programmer but in these ones we're like come up with the answer for me." - Martin

Software Is Eating Itself: The Great Disruption

  • A New Paradigm: The team agrees this is one of the most significant shifts in their careers. The old rules and analogies—comparing models to databases or networks—are insufficient. The industry is starting from a "blank sheet of paper" to figure out how to program and build with these new, non-deterministic systems.
  • Infrastructure is Layered, Not Replaced: A key principle that still applies is that infrastructure is never discarded; new layers are simply added on top. Existing compute, networking, and storage remain critical, but they are now being augmented and redefined by the AI layer.

Lessons from Past Tech Super Cycles

  • TAM Expansion: New infrastructure drastically lowers the marginal cost of a key resource (e.g., distribution for the internet, computation for AI), leading to a massive expansion of the Total Addressable Market (TAM).
  • Emergence of New Behaviors: This expansion brings in new users and enables entirely new behaviors that existing companies are ill-equipped to serve. This creates a "white space" for startups and challengers to innovate and capture market share.
  • The Promise of Low-Code, Realized: Jennifer, a long-time advocate for low-code/no-code tools, notes that AI is finally delivering on this promise. Natural language is becoming the new programming interface, empowering a much broader audience to create software and prototypes without deep technical training.

The Evolution of an Infrastructure Investment Practice

  • The Technical Buyer: Unlike vertical SaaS sold to specific industries (e.g., flooring, construction), infrastructure is typically horizontal and sold to a centralized, educated technical buyer (like a CTO or head of engineering). This requires a different type of diligence, market analysis, and expertise.
  • The Developer as the New Consumer: The speakers highlight a "horseshoe theory" where the infrastructure buyer is looping back to resemble a consumer. With over 50 million developers globally, adoption is increasingly bottom-up, driven by individual preferences and Product-Led Growth (PLG) motions. Marketing and sales to developers now look more like consumer marketing than traditional enterprise sales.
    Strategic Implication: For investors, this means evaluating developer-facing tools requires understanding consumer-like adoption funnels, community building, and individual user experience, not just enterprise sales cycles.

Mapping the Modern Infrastructure Landscape

  • Key Categories:
    • Developer Tools: Products that improve developer productivity. Examples include GitHub and Cursor.
    • Core Infrastructure: The foundational compute, network, and storage layers.
    • Data Systems: A major focus area, including data engines (Databricks, Spark) and analyst-facing tools (dbt, Hex).
  • The Blurring of Infra and Apps: In a new wave like AI, the technology itself often becomes the application. Companies like Midjourney or OpenAI function as both infrastructure providers (offering a model) and application companies (offering a user-facing product).
    "It's very hard to answer if OpenAI is an app company or it's an infra company. It literally is building infrastructure that's like a cloud running these models... but at the same time building a consumer app that's chat GPT." - Jennifer

The Defensibility Fallacy in an Expanding Market

  • Expansion Phase Dynamics: Martin argues that in an expansionary phase, "zero-sum thinking is deadly." The market is growing so rapidly that companies at every layer of the stack—from models to cloud providers to applications—are capturing immense value and maintaining margins.
  • The Myth of Commoditization: Infrastructure layers rarely disappear; they consolidate into oligopolies or monopolies (e.g., the cloud providers) that maintain pricing power. Switching costs are also deceptively high, as logic and systems become deeply integrated with a specific infrastructure component.
  • Actionable Insight: Investors should not be deterred by arguments of "no defensibility" during this expansion phase. The focus should be on identifying companies that can capture a piece of the rapidly growing pie, as value is accruing across the entire stack.

From Prompt Engineering to Context Engineering

  • Context is King: The performance of an AI model is highly dependent on the quality of the context provided in the prompt. This involves using traditional computer science (indexes, prioritization, and data pipelines) to feed the model the right information at the right time; a minimal sketch of the pattern follows this list.
  • A New Software Pattern: This emerging discipline of context engineering is a prime example of how a new infrastructure piece (AI models) creates new, formal methods for building systems. This points to investment opportunities in tools that help manage and optimize this context.
  • The Role of Data Pipelines: Jennifer emphasizes that this new pattern reinforces the timeless importance of data pipelines. The challenge is feeding the right data and context into models, which is a classic, unsolved infrastructure problem now supercharged by AI.
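
To ground the pattern, here is a minimal sketch of context assembly in Python. All names (`Doc`, `build_context`, `make_prompt`) are illustrative assumptions, and the naive keyword-overlap score stands in for the embedding indexes and retrieval pipelines a production system would use.

```python
# Minimal sketch of context engineering: rank candidate documents, pack the
# best ones into a fixed budget, and assemble the final prompt. All names
# here are illustrative; production systems would use embedding indexes and
# retrieval pipelines instead of keyword overlap.
from dataclasses import dataclass


@dataclass
class Doc:
    source: str
    text: str


def relevance(query: str, doc: Doc) -> float:
    # Naive keyword-overlap score, standing in for a real embedding index.
    q_terms = set(query.lower().split())
    d_terms = set(doc.text.lower().split())
    return len(q_terms & d_terms) / max(len(q_terms), 1)


def build_context(query: str, docs: list[Doc], budget_chars: int = 2000) -> str:
    # Prioritize the most relevant documents, then pack greedily until the
    # budget runs out -- the "indexes, prioritization" step from the prose.
    ranked = sorted(docs, key=lambda d: relevance(query, d), reverse=True)
    pieces, used = [], 0
    for doc in ranked:
        if used + len(doc.text) > budget_chars:
            continue
        pieces.append(f"[{doc.source}]\n{doc.text}")
        used += len(doc.text)
    return "\n\n".join(pieces)


def make_prompt(query: str, docs: list[Doc]) -> str:
    return (
        "Answer using only the context below.\n\n"
        f"### Context\n{build_context(query, docs)}\n\n"
        f"### Question\n{query}\n"
    )


if __name__ == "__main__":
    docs = [
        Doc("runbook.md", "Restart the ingest service with: systemctl restart ingest"),
        Doc("faq.md", "Billing questions go to the finance team."),
    ]
    print(make_prompt("How do I restart the ingest service?", docs))
```

The budget parameter is where the classic infrastructure work shows up: context windows are finite, so deciding which data gets in, and in what order, is the prioritization problem the speakers describe.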

The Future of Developers and the Nature of Coding

  • More Developers, Not Fewer: History shows that powerful new tools don't shrink the market; they expand it by making development more accessible and productive. AI will create more software and, therefore, more need for people to build, maintain, and design it.
  • The Enduring Value of Product Decisions: The hardest part of software is not writing the code but making the thousands of decisions about workflow, user needs, and operational logic. This requires human domain expertise and product sense, which AI cannot replicate.
    "The reason people buy software is cuz somebody else made the decisions of what the workflow should be and what the operational logic should be... that just doesn't go away independent of how you create the software." - Martin
  • The State of AI Agents: The team offers a pragmatic view on AI agents. They are highly effective in domains with built-in error correction, like coding (where you can compile, lint, and test). However, for open-ended tasks like general web browsing, errors compound without correction, making agents unreliable.

Conclusion: A New Era of Building

This conversation underscores that AI is a seismic infrastructure shift, creating a non-zero-sum environment where value is accruing at every layer. For investors and researchers, the key is to focus on the new programming paradigms and developer behaviors emerging, as these are the leading indicators of where durable companies will be built.
