Opentensor Foundation
August 8, 2025

Novelty Search :: Bittensor Subnet 14 :: TaoHash

This is a deep dive into the evolution of TaoHash (Subnet 14), a decentralized Bitcoin mining pool, and the ambitious launch of Subnet 5, a collaboration aimed at cracking AGI benchmarks with hierarchical AI. Features insights from the teams at Latent Holdings and Manifold.

TaoHash: Decentralizing Bitcoin Mining

  • "56.8% of the Bitcoin network hash rate is controlled by just three pools. This is a big problem... We'd like to solve that."
  • "We trusted validators way too much to return value. We've removed trust where possible and we will become further trustless over time."
  • TaoHash is on a mission to decentralize Bitcoin mining, but its first design had a fatal flaw. By trying to fully subsidize mining with its native token (Alpha), it created constant downward price pressure. The original model trusted validators to return mined Bitcoin to the subnet, but they often kept it, breaking the economic loop. The redesign operates like a traditional mining pool: miners now receive their Bitcoin directly, minus a low ~2% pool fee, and earn Alpha as an additional subsidy. This more sustainable model allows TaoHash to scale its incentivized hash rate to potentially 7% of the entire Bitcoin network.

A New Path to AGI

  • "Our mission is to pioneer a new path to AGI by harnessing hierarchical learning and reasoning... We believe that achieving human-level intelligence will not come from sheer scale alone, but from architectures that learn and think in levels much like the human brain."
  • Subnet 5, a collaboration between Latent Holdings and Manifold, is a bet against the "bigger is better" AI training meta. The project argues that current auto-regressive models are sample-inefficient and brute-force their way to intelligence. Instead, Subnet 5 will focus on hierarchical models, like Yann LeCun's JEPA, which can form abstract representations and create entire action plans in a single forward pass. The goal is to create smaller, more efficient models that demonstrate superior reasoning and planning—a step-function improvement over today's LLMs.

The ARC-AGI-2 Gauntlet

  • "We're going to be contributing to open source AI by solving the ARC-AGI-2 benchmark, which no one has solved yet."
  • Subnet 5 isn’t just hand-waving about AGI; it has a clear, measurable, and brutally difficult target: solving the ARC-AGI-2 benchmark. Current state-of-the-art models stagnate at around 5% accuracy on this test of abstract reasoning. However, a recent 27-million-parameter hierarchical model hit 6% accuracy, proving that smaller, smarter architectures can outperform behemoths. By focusing on this specific challenge, the subnet creates an "AI model factory" to rapidly iterate on small models (1M-10B parameters) in a cost-effective way that only Bittensor's incentive structure can enable.

Key Takeaways:

  • The podcast highlights two crucial trends: the maturation of subnet economics and a strategic shift towards novel AI architectures. TaoHash’s pivot demonstrates that sustainable subnets must align with, not fight against, the economics of the market they serve. Meanwhile, Subnet 5’s mission suggests Bittensor’s future isn’t just about competing on scale, but about pioneering fundamentally new and more efficient paths to intelligence.
  • Sustainable Economics Win: TaoHash's initial model failed because it tried to use an inefficient token subsidy to capture a hyper-efficient market (Bitcoin mining). The successful pivot was to act like a standard pool and use its token as a value-add subsidy, not a revenue replacement.
  • Architecture Over Brute Force: Subnet 5 is a bet that the next leap in AI will come from architectural innovation, not just throwing more parameters at the problem. By focusing on hierarchical models, it aims to build smaller, smarter systems that can out-reason massive LLMs on complex tasks.
  • Benchmarks Ground Innovation: A clear, difficult, and measurable goal like solving ARC-AGI-2 focuses the network's energy. It transforms a vague mission ("build AGI") into a concrete engineering problem, allowing for rapid, cost-effective iteration and a clear definition of success.

For further insights and detailed discussions, watch the full podcast: Link

This episode reveals how Bittensor's top builders are pivoting from early-stage incentive models to sophisticated, sustainable subnet economies, tackling everything from Bitcoin mining decentralization to the next frontier of AGI research.

Introduction to the Team and Latent Holdings

The episode brings together key figures who have been instrumental in Bittensor's development. Jake, the host, introduces Cameron (aka Vune), a core developer who has maintained critical infrastructure like the bittensor API, and Abe, another long-term contributor. Both are now central to Latent Holdings, a team that took on the responsibility of maintaining core Bittensor tooling after the OpenTensor Foundation began decentralizing its operations.

Joseph, representing Latent Holdings, outlines the team's mission: to make Bittensor more accessible, stable, and understandable. Their work is foundational to the ecosystem's health.

  • Core Contributions: Latent Holdings maintains the Bittensor SDK, CLI, developer documentation, and other open-source tools. They are the largest open-source contributor to Bittensor outside the foundation.
  • Key Projects: The team is behind community-facing projects like Tao App (an analytics explorer) and Learn Bittensor (an educational resource).
  • Impressive Velocity: In just six months, the small team has produced over 100 releases, nearly 4,000 code commits, and changed over 173,000 lines of code, demonstrating immense dedication and efficiency.

TaoHash (Subnet 14): Decentralizing Bitcoin Mining

Abe introduces TaoHash (Subnet 14), a project designed to challenge the centralization in Bitcoin mining. He highlights a critical vulnerability in the Bitcoin network: just three mining pools control a majority (56.8%) of the network's hash rate. TaoHash aims to solve this by creating an openly owned and accessible Bitcoin mining pool.

  • Hash Rate: This is the total combined computational power being used to mine and process transactions on a proof-of-work blockchain like Bitcoin. Higher hash rate means greater network security.
  • How it Works: Miners direct their hash rate to the TaoHash pool. The pool contributes this power to the Bitcoin network, earns Bitcoin rewards, and passes them back to the miners. The subnet's native token, Alpha, is used to provide an additional incentive layer.
  • Value Proposition: TaoHash offers miners one of the lowest pool fees on the market (around 2%), lightning-fast payouts, and full transparency with open-source code.
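The pool mechanics described above can be sketched as a simple proportional payout: each miner's share of the block reward is weighted by submitted hash rate, minus the ~2% pool fee. This is an illustrative sketch with made-up names and values, not TaoHash's actual accounting scheme (real pools typically use schemes like PPLNS over submitted shares).

```python
# Hypothetical proportional-payout sketch: split the BTC reward by hash rate,
# minus the ~2% pool fee. Names and figures are illustrative only.
POOL_FEE = 0.02  # ~2% pool fee quoted in the episode

def distribute_rewards(block_reward_btc: float, hashrate_by_miner: dict) -> dict:
    """Split a block reward among miners in proportion to hash rate."""
    total_hashrate = sum(hashrate_by_miner.values())
    payable = block_reward_btc * (1 - POOL_FEE)
    return {m: payable * h / total_hashrate for m, h in hashrate_by_miner.items()}

# 3.125 BTC block subsidy split between two miners at 60/40 hash rate (H/s):
payouts = distribute_rewards(3.125, {"miner_a": 60e15, "miner_b": 40e15})
```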

Abe states the core problem simply: "56.8% of the Bitcoin network hash rate is controlled by just three pools. This is a big problem and it's not good for the Bitcoin network."

The Evolution of TaoHash: Learning from Past Mistakes

The conversation turns to the "drama" surrounding TaoHash's initial design. Abe provides a candid post-mortem on the original model and the critical lessons learned, offering a valuable case study in subnet economics for investors and researchers.

The old design had several fundamental flaws:

  • Scalability Issues: The subnet's ability to incentivize hash rate was directly capped by the price of its Alpha token. A drop in the token price would immediately reduce the hash rate the subnet could support, creating a risk of a "downward spiral."
  • Misaligned Validator Incentives: The original model trusted validators to sell the mined Bitcoin and use the proceeds to buy back Alpha, driving value back to the subnet. Instead, many validators simply kept the Bitcoin, breaking the economic loop.
  • Inefficient Buybacks: The buyback mechanism was susceptible to MEV (Maximal Extractable Value) and failed to distribute value evenly to all token holders.
  • Currency Risk: The subnet was exposed to volatility in the TAO-to-Bitcoin conversion rate.
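The scalability flaw is worth making concrete: if miners are paid solely in Alpha, the hash rate the subnet can attract scales linearly with the Alpha price, so a price drop immediately shrinks supportable hash rate. A back-of-the-envelope sketch, with all numbers assumed for illustration rather than taken from the episode:

```python
# Illustrative cap on incentivized hash rate under the old, Alpha-only design.
# All inputs (emission, price, miner cost) are assumed figures.
def max_incentivized_hashrate(alpha_emission_per_day: float,
                              alpha_price_usd: float,
                              miner_cost_usd_per_phs_day: float) -> float:
    """Max PH/s miners will point at the pool if paid solely in Alpha."""
    daily_budget_usd = alpha_emission_per_day * alpha_price_usd
    return daily_budget_usd / miner_cost_usd_per_phs_day

high = max_incentivized_hashrate(7200, 0.50, 50.0)  # higher Alpha price
low = max_incentivized_hashrate(7200, 0.25, 50.0)   # price halves -> hash rate halves
```

The linear dependence is the "downward spiral" risk: lower price, lower hash rate, weaker subnet, lower price.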

The Redesigned TaoHash: A More Sustainable Model

The team redesigned TaoHash to function more like a traditional mining pool, but with a crypto-economic twist. This new model is more robust and sustainable.

  • Direct Bitcoin Payouts: Instead of trying to capture 100% of the mining rewards, TaoHash now passes the majority of the mined Bitcoin directly back to the miners, minus a small pool fee.
  • Alpha as a Subsidy: The Alpha token now acts as a subsidy, effectively lowering the net pool fee for miners to ~1.5-1.75%, making it highly competitive. This removes the direct dependency on the Alpha price for incentivizing hash rate.
  • Strategic Implication: This pivot demonstrates a maturing understanding of subnet economics. Investors should favor models that provide direct, tangible value (like BTC payouts) and use native tokens for subsidies or governance, rather than as the sole, volatile reward mechanism.
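The subsidy arithmetic in the first two bullets can be sketched as follows. The numbers are illustrative assumptions chosen to land in the quoted ~1.5-1.75% range, not TaoHash's actual parameters:

```python
# Sketch of the redesigned model's net fee: miners pay the ~2% pool fee in BTC
# but earn Alpha worth some fraction of their mined BTC, lowering the
# effective fee. Inputs here are assumed for illustration.
def effective_fee(btc_mined: float, pool_fee: float,
                  alpha_earned: float, alpha_price_btc: float) -> float:
    """Net fee rate once the BTC value of the Alpha subsidy is counted."""
    fee_paid = btc_mined * pool_fee
    subsidy_value = alpha_earned * alpha_price_btc
    return (fee_paid - subsidy_value) / btc_mined

# Alpha subsidy worth 0.5% of mined BTC -> 1.5% net fee
net = effective_fee(btc_mined=1.0, pool_fee=0.02,
                    alpha_earned=500.0, alpha_price_btc=0.00001)
```

Crucially, if the Alpha price falls, the net fee drifts back toward 2% but BTC payouts continue unchanged; the subsidy degrades gracefully instead of collapsing the hash rate.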

Future Roadmap: A Trustless Hash Rate Marketplace

TaoHash's ambition extends beyond being just a mining pool. The team is building an EVM-based, decentralized hash rate marketplace, inspired by services like NiceHash.

  • Hash Rate Marketplace: This will allow the subnet's hash rate to be sold trustlessly to the highest bidder. This financializes hash rate, turning it into a tradable commodity.
  • Benefits: This creates a premium on the hash rate, providing faster payouts for miners and higher returns for Alpha token holders who govern the market. It also removes the team as a trusted intermediary.
  • Decentralized Infrastructure: The long-term vision includes decentralizing the pool infrastructure itself, likely having validators host endpoints to create a fault-tolerant, distributed system.

Subnet 5 (Hone): A New Frontier in Hierarchical AI

The discussion shifts to a new collaboration between Latent Holdings and Manifold on Subnet 5, codenamed "Hone." Rob from Manifold passionately outlines an ambitious mission: to achieve a breakthrough in Artificial General Intelligence (AGI) by focusing on novel architectures rather than just scaling up existing models.

  • The Goal: The project's first concrete objective is to solve the ARC-AGI-2 benchmark, a notoriously difficult test for current AI models, where even state-of-the-art systems like GPT-4 stagnate at around 5% accuracy.
  • The Approach: The subnet will focus on training hierarchical models, inspired by concepts like Yann LeCun's JEPA (Joint-Embedding Predictive Architecture). These models learn abstract representations of the world, enabling them to reason and plan in a single forward pass, unlike the thousands of passes required by current auto-regressive models.
  • Key Insight: This approach promises extreme sample efficiency. A recent paper showed a small, 27-million-parameter hierarchical model achieved 6% on ARC-AGI-2 with only 1,000 training examples—outperforming models thousands of times larger.
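The single-forward-pass claim can be illustrated with a toy counter. This is not JEPA or any real model; it only counts how many forward passes each generation scheme needs to emit an N-step plan:

```python
# Toy illustration (not a real model): an autoregressive decoder runs one
# forward pass per generated step, while a hierarchical planner that emits
# the whole plan at once runs a single pass.
class CallCounter:
    def __init__(self):
        self.forward_passes = 0

def autoregressive_plan(model: CallCounter, n_steps: int) -> list:
    plan = []
    for _ in range(n_steps):        # one forward pass per generated step
        model.forward_passes += 1
        plan.append(0)              # dummy "token"
    return plan

def hierarchical_plan(model: CallCounter, n_steps: int) -> list:
    model.forward_passes += 1       # entire plan produced in one pass
    return [0] * n_steps            # dummy plan

ar, hier = CallCounter(), CallCounter()
autoregressive_plan(ar, 1000)
hierarchical_plan(hier, 1000)
```

The 1000x gap in forward passes is the intuition behind "thousands of passes" versus "a single forward pass" above; the hard part, of course, is making that single pass actually reason.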

Rob frames the mission with urgency and clarity: "We're going to be contributing to open source AI by solving the ARC-AGI-2 benchmark, which no one has solved yet."

Mechanics and Strategic Implications of Subnet 5

The subnet will function as a distributed training network, but with a unique focus.

  • Smaller, Efficient Models: Unlike other training subnets aiming for massive parameter counts, Subnet 5 will train smaller models (starting in the 50-100 million parameter range). This makes participation accessible to researchers with consumer-grade hardware (e.g., a single 4090).
  • Rapid, Cost-Effective Iteration: The combination of small models, high sample efficiency, and a clear benchmark allows for rapid, low-cost innovation. The goal is to find a solution that is not only intelligent but also computationally cheap, a requirement of the ARC-AGI-2 prize.
  • Actionable Insight for Researchers: This project signals a shift away from the "scale is all you need" paradigm. Researchers should watch Subnet 5's progress closely, as a breakthrough would validate the potential of hierarchical architectures and open new avenues for AGI research beyond brute-force scaling.
  • Timeline: Rob commits to a public timeline of "two weeks" for launching the initial testnet, signaling a fast-moving and highly focused effort.
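A quick rule-of-thumb check shows why models in this range fit on consumer hardware. Assuming roughly 16 bytes per parameter for mixed-precision Adam training (fp16 weights and gradients plus fp32 master weights and two optimizer moments; a common heuristic, not a figure from the episode, and excluding activations):

```python
# Rough training-memory estimate: ~16 bytes/param for mixed-precision Adam
# state (heuristic assumption), before activation memory.
def training_gib(n_params: int, bytes_per_param: int = 16) -> float:
    """GiB of weight + optimizer state for a model of n_params parameters."""
    return n_params * bytes_per_param / 2**30

small = training_gib(100_000_000)    # 100M-param model: well under 2 GiB
large = training_gib(10_000_000_000) # 10B params: ~149 GiB, multi-GPU territory
```

On this estimate a 50-100M parameter model uses only a small fraction of a 4090's 24 GB, leaving ample headroom for activations and batching, while the 10B upper end of the stated range would already require distributed training.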

Conclusion

This episode highlights the maturation of the Bittensor ecosystem, showcasing a move towards sustainable tokenomics with TaoHash and a pivot to cutting-edge, targeted research with Subnet 5. For investors and researchers, the key takeaway is to prioritize subnets with robust economic designs and novel technical approaches that promise step-function progress over incremental gains.
