The Rollup
July 21, 2025

The Monad Thesis With James Hunsaker

James Hunsaker of Monad Labs, a veteran of high-frequency trading at Jump, unpacks the trade-offs between speed, decentralization, and user experience. This deep dive explores why most people misunderstand performance and how Monad aims to rebalance the on-chain economic game.

The Latency vs. Throughput Fallacy

  • "You could have a system that produces one block every hour, but that block has trillions of transactions in it. By definition, that chain would have the highest throughput. It has very high latency but has extremely high throughput."
  • "I feel like people are under-indexing on latency and over-indexing on TPS and these types of metrics."
  • Performance is a misunderstood concept. High throughput (TPS) is about handling "bursty" demand, like thousands of users placing sports bets simultaneously. Latency, the time to confirmation, is what market makers feel. For retail users, latency below a few hundred milliseconds is imperceptible.
  • The obsession with minimizing block times is a race to the bottom for high-frequency traders. Monad’s inspiration came from the need for high throughput to handle user volume, not from a need to win nanosecond latency games.
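The block-every-hour thought experiment above comes down to simple arithmetic: two systems can have identical throughput while their confirmation latency differs by orders of magnitude. A minimal sketch with illustrative numbers (not Monad benchmarks):

```python
# Illustrative only: two hypothetical systems with identical throughput
# but wildly different latency, per the block-per-hour thought experiment.

def throughput_tps(txs_per_block: int, block_time_s: float) -> float:
    """Average transactions per second."""
    return txs_per_block / block_time_s

# System A: one enormous block every hour.
batch = throughput_tps(txs_per_block=3_600_000, block_time_s=3600.0)

# System B: small blocks every 400 ms.
fast = throughput_tps(txs_per_block=400, block_time_s=0.4)

assert batch == fast == 1000.0  # same TPS...
# ...but worst-case confirmation latency is ~1 hour for System A
# versus ~400 ms for System B.
```

TPS alone cannot distinguish the two systems; only latency does, which is why the two metrics must be evaluated separately.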

The L1 Gauntlet: Decentralization vs. The Centralized Speed Run

  • "Monad isn't designed for a CLOB. It's a general-purpose blockchain similar to Ethereum or Solana. Otherwise, we would have built Hyperliquid."
  • "If you have a centralized server, you can put your HFT firm right next to it, and you will guaranteed beat all the retail people... That game is a race to single-digit nanoseconds."
  • Centralized L2s with single sequencers create a latency arms race where proximity is everything. This architecture gives an inherent advantage to traders who can co-locate their servers, turning the market into a pure speed game.
  • Monad’s thesis is that a globally distributed, decentralized L1 is worth the trade-off. By making it harder to win on latency via co-location, the system fosters a fairer competitive environment without sacrificing the high-performance UX that users demand. It bets on composability and shared security over isolated speed.

Rethinking On-Chain Economics

  • "I think chains should have generalized frameworks by which these contracts and intents can be specified, executed, and enforced. I think that's like the next generation of this stuff."
  • The current model sees massive value leakage from applications like pump.fun to base-layer validators via MEV. This creates a fundamental incentive misalignment between those who build value and those who secure the network.
  • The solution may be "application-specific sequencing," where apps can programmatically define their own transaction ordering rules. This allows them to internalize value and forces validators to play by their rules to earn their fee revenue—an on-chain, game-theoretic negotiation.

Key Takeaways

  • The conversation delivers a masterclass in the nuanced trade-offs of blockchain design, moving beyond simplistic metrics to the economic and philosophical core of what we’re building. It argues that the right architecture can deliver both elite performance and a fairer, more decentralized foundation.
  • Performance is Not Just Speed. Stop obsessing over TPS and block times. The critical challenge is handling bursty user demand while providing deep liquidity and tight spreads—qualities that stem from a well-designed system, not just raw speed.
  • Decentralization is a Deliberate Economic Choice. Monad bets that a globally distributed L1, while trading off the raw co-location speed of a centralized L2, creates a more robust and equitable economic system. It’s a feature, not a bug.
  • The Value Chain is Being Rebuilt. The future belongs to platforms that allow applications to capture the value they create. "Application-specific sequencing" is the next frontier, shifting power from validators back to the builders.

For further insights and detailed discussions, watch the full podcast: Link

This episode dissects the fundamental trade-offs between decentralized L1s and specialized L2s in the race for high-throughput, revealing what truly matters for building the next generation of on-chain applications.

The Genesis of High-Throughput Demand

  • James Hunsaker, drawing from his extensive market experience, identifies the legalization of sports betting as a key inspiration for his work on high-performance blockchains. He observed the high-frequency, "bursty" nature of in-game betting, where thousands of users place bets simultaneously during peak moments. This real-world scenario highlighted a critical gap in the market.
  • This use case, characterized by massive, concurrent user activity, demonstrated that no existing blockchain could handle such peak loads.
  • The core problem was not just average transaction speed but the ability to manage intense, concentrated bursts of demand without failure.
  • James Hunsaker: "I was thinking of like a ton of people sitting at home with their cell phones just like betting... and you start like thinking through numbers... and like could the chain handle that and the answer is no."

AI Agents and the Future of Transaction Volume

  • The conversation expands to include AI agents as another major driver for high-throughput demand. James notes that the concept of autonomous agents transacting on behalf of humans, while discussed for decades, is becoming a more tangible reality that will stress current infrastructure.
  • AI agents interacting with each other could multiply transaction volume far beyond human capacity.
  • This creates a future where high-performance, scalable infrastructure is not just an advantage but a prerequisite for a new class of automated, on-chain economies.
  • Actionable Insight: Investors and researchers should view AI agent activity as a future multiplier on transaction demand. Protocols that can handle this scale will have a significant long-term structural advantage.

Finding the Economic Equilibrium for On-Chain Activity

  • James posits that not all activity needs to be on-chain. He argues that an economic equilibrium will naturally form based on the value and security requirements of a given transaction.
  • High-value, infrequent transfers justify the high security costs of a decentralized ledger.
  • Conversely, low-value, frequent actions (like IoT data or micropayments) may not be economically viable to record on-chain if the security cost outweighs the value.
  • The market will ultimately be defined by use cases where the cost of decentralized security is less than the value it provides. Identifying these pockets of demand, such as in high-value finance, is key.
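The equilibrium argument reduces to a threshold test: an action belongs on-chain only when the value of decentralized settlement exceeds its security cost. A toy sketch (function name and numbers are hypothetical, for illustration only):

```python
def belongs_on_chain(value_secured: float, security_cost: float) -> bool:
    """An action is economically viable on-chain only if the value of
    decentralized security exceeds what that security costs to obtain."""
    return value_secured > security_cost

# A high-value, infrequent transfer: the fee is trivial relative to
# what it protects.
assert belongs_on_chain(value_secured=1_000_000.0, security_cost=5.0)

# A micropayment or IoT reading: the same fee swamps the value moved.
assert not belongs_on_chain(value_secured=0.01, security_cost=5.0)
```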

The Ideal Base Layer for Central Limit Order Books (CLOBs)

  • The discussion shifts to the ideal environment for high-performance applications like CLOBs (central limit order books), the order-book exchange model standard in traditional finance. James frames the debate as a series of trade-offs, questioning what a system is truly optimizing for.
  • He contrasts a general-purpose L1 like Monad with a specialized app-chain like Hyperliquid, noting Monad was intentionally designed for broad composability, not just a single application.
  • The choice between a specialized chain offering latency advantages for market makers and a general-purpose L1 offering shared security and composability depends entirely on an application's specific needs and long-term goals.
  • James notes that in traditional finance, exchanges like CME use FPGAs (Field-Programmable Gate Arrays)—specialized hardware for ultra-low-latency tasks like timestamping orders—to ensure fairness. However, he argues a system doesn't need to reach this extreme to be successful.

What Defines a Quality Trading System?

  • Leveraging his HFT background, James explains that a quality trading system is not just about raw speed but a complex interplay of factors that create a fair and liquid market.
  • He points out that many profitable markets run on older, less optimized technology, proving that liquidity can be attracted through other means like rebates and market-making incentives.
  • A system's design can prioritize different things, such as favoring order cancellations over fills to protect market makers, which in turn encourages tighter spreads for retail users.
  • James Hunsaker: "You can still have a ton of demand. You can still have a ton of liquidity with a relatively poorly designed system. There's other ways sort of incentivizing... certain behavior."

The L1 vs. L2 Debate: Co-location, Decentralization, and User Experience

  • This section dives into the core architectural debate between L2s and decentralized L1s. L2s with single sequencers offer a centralized model where market makers can gain a latency advantage via sequencer co-location, placing their servers physically next to the sequencer.
  • James argues this creates a latency "race to the bottom" that benefits a few sophisticated firms but questions its broader market value.
  • Monad’s approach, with a globally distributed validator set, intentionally makes this race harder, forcing competition onto a more level playing field and prioritizing decentralization.
  • Strategic Implication: The choice between an L2 and a decentralized L1 like Monad is a bet on what will create more long-term value: isolated, hyper-optimized performance for a narrow set of users, or a globally accessible, composable ecosystem.

Latency vs. Throughput: What Really Matters?

  • James clarifies the often-confused performance metrics of throughput and latency.
  • Throughput (TPS) measures the total number of transactions a system can process over time. This is critical for handling "bursty" use cases with many simultaneous users, like sports betting.
  • Latency measures the time it takes for a single transaction to be confirmed. For retail users, latency below a few hundred milliseconds is imperceptible, but for market makers, it is critical.
  • He uses the analogy of a semi-truck full of hard drives: it has incredibly high throughput but also very high latency. A system's design must match the needs of its target use case.
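The semi-truck analogy holds up to arithmetic. With assumed figures for drive count, capacity, and trip time (all hypothetical), the truck's effective bandwidth dwarfs any network link, while its first-byte latency is the entire trip:

```python
# Illustrative: throughput vs latency for a truck full of hard drives.
drives = 10_000        # drives in the trailer (assumed)
tb_per_drive = 20      # TB per drive (assumed)
trip_hours = 24        # drive time (assumed)

payload_bits = drives * tb_per_drive * 1e12 * 8   # total bits hauled
throughput_gbps = payload_bits / (trip_hours * 3600) / 1e9

# Enormous effective throughput (tens of thousands of Gbit/s)...
assert throughput_gbps > 10_000
# ...but the latency of the *first* byte is the full 24-hour trip,
# versus tens of milliseconds for a packet on fiber.
```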

Application Economics and The Future of Value Capture

  • The conversation addresses the value leakage from applications (like pump.fun on Solana) to validators via MEV. James advocates for a future with application-specific sequencing, where apps can define their own ordering rules.
  • This model would allow applications to internalize value or pass it directly to their users, creating a more aligned economic relationship between the app and the base layer.
  • He envisions a generalized on-chain framework where these "agreements" between apps and validators can be programmatically defined and enforced.
  • Actionable Insight for Researchers: The development of frameworks for app-specific sequencing is a critical area to watch. It represents a potential paradigm shift in how value is distributed in blockchain ecosystems, moving power from validators to applications.
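One way to picture application-specific sequencing is as an app-supplied ordering function that the block builder must apply to that app's transactions before inclusion. The sketch below is hypothetical (the registry, rule shapes, and fallback are assumptions, not Monad's design):

```python
from typing import Callable

Tx = dict  # e.g. {"app": ..., "fee": ..., "arrival": ...}
OrderingRule = Callable[[list[Tx]], list[Tx]]

# Hypothetical registry of ordering rules published by each application.
rules: dict[str, OrderingRule] = {
    # A DEX that neutralizes fee-based queue-jumping by ordering its
    # own transactions strictly first-come-first-served:
    "dex": lambda txs: sorted(txs, key=lambda t: t["arrival"]),
    # Fallback for apps with no rule: highest fee first (validator's choice).
    "__default__": lambda txs: sorted(txs, key=lambda t: -t["fee"]),
}

def sequence(mempool: list[Tx]) -> list[Tx]:
    """Group transactions by app, apply each app's published rule,
    then concatenate the groups into one block ordering."""
    by_app: dict[str, list[Tx]] = {}
    for tx in mempool:
        by_app.setdefault(tx["app"], []).append(tx)
    out: list[Tx] = []
    for app, txs in by_app.items():
        out.extend(rules.get(app, rules["__default__"])(txs))
    return out

mempool = [
    {"app": "dex", "fee": 9, "arrival": 2},
    {"app": "dex", "fee": 1, "arrival": 1},
    {"app": "game", "fee": 3, "arrival": 3},
    {"app": "game", "fee": 7, "arrival": 4},
]
ordered = sequence(mempool)
# Within the DEX, arrival order wins despite the fee gap; the ruleless
# game app falls back to the validator's fee-priority ordering.
assert [t["arrival"] for t in ordered if t["app"] == "dex"] == [1, 2]
assert [t["fee"] for t in ordered if t["app"] == "game"] == [7, 3]
```

The game-theoretic negotiation James describes would live in how these rules are enforced: validators that ignore an app's published rule forfeit that app's fee revenue.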

The "Infra is Dead" Narrative and Developer Support

  • James dismisses the "infra is dead" narrative as user burnout from a wave of projects that offered only marginal improvements while raising huge sums. He asserts that meaningful infrastructure innovation is far from over.
  • He stresses the importance of deep, collaborative support between the core protocol team and application developers to optimize performance, a key lesson from his time at Jump Trading working on Solana.
  • This includes optimizing both on-chain and off-chain components, such as RPCs and data indexing, which are often bottlenecks for high-performance apps.
  • James Hunsaker: "I think people are burned out on infra just because a lot of infra hasn't been innovative and it's just like oh we got another one oh we got another one."

James Reacts: Unpacking Past Tweets

  • On Malicious Slashing: He questions the utility of automated slashing, arguing that most on-chain errors stem from incompetence (e.g., misconfiguration) or software bugs, not malicious intent. Slashing for a bug created by the core developers creates a paradox.
  • On the "IDGAF Chain": He satirizes chains that sacrifice decentralization and security for hype, urging for honesty about their intentions. "Just be honest... you're here to play with ponzies... come to my chain and play with ponzies and we'll ponzi together."
  • On Crypto for Payments: He debunks the simplistic argument that crypto saves 2-3% on payments, pointing out that credit cards offer valuable services like fraud protection and cash-back rewards, making it an "apples to oranges comparison."

Conclusion

This discussion reveals a fundamental choice for the crypto ecosystem: pursue hyper-optimized, centralized performance in siloed environments or build on decentralized, composable L1s that level the playing field. For investors, understanding where true, sustainable value will be created in this trade-off is the critical thesis to develop.
