Hash Rate pod - Bitcoin, AI, DePIN, DeFi
May 7, 2025

Hash Rate - Ep 108 - Targon - $TAO Subnet 4

This episode dives deep into Targon (BitTensor Subnet 4) with James Woodman, formerly of the OpenTensor Foundation and now spearheading Targon. The discussion centers on Targon's mission to build a decentralized AI inference and development platform, aiming to outmaneuver centralized giants.

Targon: Slashing AI Inference Costs

  • "It's almost like your AWS... you're competing with Amazon Web Services... by supplying the bandwidth and the compute and the preloaded model. I just push a button and I'm up and running."
  • "The cost of your product is significantly less for the end user than AWS... at least 85% cheaper."
  • Targon provides AI inference services, running open-source models like DeepSeek on a decentralized network of GPUs from Subnet 4 miners.
  • It offers a staggering 85% cost reduction compared to traditional cloud providers for similar AI compute, making powerful AI more accessible.
  • Users can pay with credit cards, with revenue currently supporting the platform's growth and, over time, reducing reliance on TAO token subsidies.

The BitTensor Playbook: Subsidies to Sustainability

  • "I like the Uber analogy... BitTensor is a very novel way to bootstrap your business. You don't have to go raise a seed round... you have a built-in subsidy."
  • "We're really laser focused... on how do we go and take the compute that we have procured and add more value into the real world... than what the protocol is emitting."
  • BitTensor's TAO token emissions (currently $972 million annually) provide an initial subsidy, bootstrapping compute networks like Targon without massive upfront VC.
  • Targon’s long-term strategy involves building an "AI creator" marketplace, connecting global talent with its compute to foster novel AI models and applications.
  • The goal is a self-sustaining flywheel: real-world revenue from valuable AI services will eventually outweigh the initial TAO subsidies, creating a profitable, decentralized AI ecosystem. Dippy, a popular AI chatbot, is migrating to Targon for cost savings, a key validation.

Incentives & The Decentralized Edge

  • "If BitTensor has taught me one thing... 'Show me the incentive. I'll show you the outcome.'
  • "We have this composability of subnets using other subnets... you have these built-in business partners."
  • BitTensor’s architecture fosters deep composability, where subnets are incentivized to use each other’s services (e.g., Dippy, another subnet, using Targon), driving value across the entire TAO ecosystem.
  • Unlike centralized AI which relies on hiring all top talent internally, Targon and BitTensor aim to build a platform enabling brilliant minds globally, believing this decentralized approach has a higher chance of success.
  • Dynamic TAO allows token holders to vote on emission distributions, directing resources to promising subnets, creating a meritocratic system for innovation.

Key Takeaways:

  • Targon is leveraging BitTensor's unique tokenomics to offer radically cheaper AI inference, positioning itself as a formidable competitor to centralized AI. The focus is shifting from initial subsidies to building a sustainable, revenue-generating marketplace that attracts global AI talent.
  • Decentralized Disruption: Targon offers AI inference at an 85% discount to AWS, powered by BitTensor's TAO-subsidized distributed compute network.
  • Sustainable AI: The mission is to transcend subsidies by creating an "AI creator" marketplace, funneling real-world revenue (Stripe payments) back into the ecosystem.
  • Incentive Alignment Wins: BitTensor's composable subnets and dynamic TAO voting create a powerful, self-reinforcing ecosystem driving innovation and value back to TAO.

For further insights and detailed discussions, watch the podcast: Link

This episode unpacks Targon's (BitTensor Subnet 4) strategy to disrupt centralized AI by leveraging decentralized compute for inference, aiming to build a sustainable, community-driven AI ecosystem.

BitTensor Endgame Conference Reflections

  • Mark Jeffrey and James Woodman open by discussing their experiences at "BitTensor Endgame," the first official conference for the BitTensor ecosystem.
  • They emphasize the high-signal environment and palpable energy, noting the diverse attendance which included long-standing community members, newer participants like James himself, and representatives from Sand Hill Road venture capital firms such as Gumi Cryptos.
  • James highlights the unique value of in-person interactions, contrasting them with daily online collaborations.
    • Mark Jeffrey: "It felt to me like the very early Bitcoin conferences or even the very early internet conferences... where there was like, you know, a couple hundred people maybe and anyone who was not there thought we were all insane."
  • The conference was perceived by many as a pivotal moment, potentially marking the formal congregation of a movement that could reshape AI development.
  • Strategic Implication: The enthusiastic community engagement and emerging institutional interest observed at the conference suggest growing momentum for the BitTensor ecosystem, a critical development for investors and researchers to monitor.

James Woodman's Journey into BitTensor

  • James Woodman shares his unconventional path from a traditional M&A role in New York City to the burgeoning world of cryptocurrency.
  • His initial exposure to crypto came while covering fintech companies, which eventually led him to join GSR, a specialized digital asset liquidity provider and trading firm.
    • GSR: A significant market-making and investment entity in the digital asset sector, known for enhancing liquidity for various crypto projects.
  • In late 2022, James discovered BitTensor. He was profoundly impacted by an interview with its founders, Jacob Steeves and Ala Shaabana, and the project's foundational whitepaper, despite the limited information available at the time.
    • James Woodman: "I watched this interview of Jake and Ala. It had a thousand views and I thought it was the best interview I'd ever heard in my whole life."
  • He recounts the early, somewhat precarious method of acquiring TAO (BitTensor's native token) through an illiquid platform named Tensor Exchange, which involved sending Bitcoin and awaiting the TAO tokens in return.
    • TAO: The native cryptocurrency of the BitTensor network, integral for staking, governance, and incentivizing network participation.
    • Tensor Exchange: An early, rudimentary over-the-counter style platform for trading TAO, underscoring the project's nascent stage.
  • An incident in which his Bitcoin transaction was temporarily stuck, then resolved with community assistance, further solidified his belief in the project's unique potential and strong community.
  • Investor Insight: The early, often "janky," stages of groundbreaking projects can obscure immense opportunity. James's experience shows that identifying visionary teams and resilient communities amidst such initial roughness can be key to early involvement in transformative technologies.

From Liquidity Provision to OpenTensor Foundation

  • Identifying BitTensor's critical need for enhanced liquidity, James, during his tenure at GSR, took the initiative to provide solutions.
  • After reaching out to several community members, he connected via Discord and proposed liquidity support, leading to GSR's involvement in one of the first TAO token investment and liquidity provisioning deals in March 2023. This facilitated listings on exchanges like MEXC.
    • MEXC: A centralized cryptocurrency exchange known for listing a diverse array of tokens, often providing earlier access than larger platforms.
  • James describes the significant hurdles in getting TAO listed, even on what he termed "Farm League" exchanges, due to its price volatility.
    • James Woodman: "It's gone from 80... down to 35. And he said, 'This thing's dead beat. You really want to touch it?' And I said, 'We have to touch it.'
  • A crucial meeting at Consensus 2023 with Rob, the first member of the BitTensor Discord and a prominent miner (now James's partner at Manifold), left a lasting impression on James.
  • This connection led to James relocating from New York to Austin to join the OpenTensor Foundation (the non-profit supporting BitTensor) as COO, working closely with Rob. Manifold, the entity behind Targon (Subnet 4), was subsequently established in December 2023.
    • OpenTensor Foundation: The organization dedicated to fostering the development, growth, and adoption of the BitTensor ecosystem.
  • Strategic Consideration: James's progression from identifying a project's core need (liquidity) to becoming deeply integrated into its foundational operations illustrates a potent pathway for impactful contribution and strategic investment in emerging crypto AI networks.

Introducing Targon (BitTensor Subnet 4)

  • James explains that Targon.com, the product of Manifold, currently focuses on performing inference.
    • Inference (AI): The process where a trained machine learning model uses new input data to make predictions or generate outputs. For example, feeding an image to a model to classify it, or giving a text prompt to an LLM to generate a story. (A short request sketch at the end of this section makes this concrete.)
  • He points out a fundamental challenge in the current BitTensor incentive model: "For every dollar that a miner receives, this is funded by $1 in speculation, which you don't need to be a genius to know that is unsustainable."
  • Targon's mission is to introduce organic revenue streams. This creates a self-reinforcing cycle: miners contribute compute power; Targon utilizes this compute to serve existing AI models (such as DeepSeek AI, an open-source Chinese model) and, in the future, to establish a marketplace for AI model development.
    • DeepSeek AI: A family of open-source large language models developed in China, recognized for their competitive performance.
  • The long-term vision for Targon is to empower global talent—including individual researchers, academic institutions, and corporations—by providing access to its compute resources. This will enable them to build, deploy, and monetize state-of-the-art AI models, fine-tuned versions, and applications.
  • Actionable Insight for Researchers: Targon's planned compute marketplace represents a significant opportunity for AI researchers who lack access to costly GPU infrastructure, offering a platform to contribute to and benefit from a decentralized AI ecosystem.
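
To make the inference step concrete, here is a minimal sketch of what a hosted-inference request typically looks like from a developer's perspective. The endpoint URL, model name, and API key are illustrative placeholders, not Targon's documented API; the OpenAI-compatible request shape is an assumption, chosen because it is the common convention among inference providers.

```python
# Minimal sketch of an inference request against an OpenAI-compatible
# chat-completions endpoint. The URL, model name, and API key are
# illustrative placeholders, not Targon's documented API.
import requests

API_URL = "https://inference.example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                        # placeholder credential

payload = {
    "model": "deepseek-chat",  # an open-source model family mentioned in the episode
    "messages": [
        {"role": "user", "content": "Explain AI inference in one sentence."}
    ],
    "max_tokens": 100,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```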

Targon's Competitive Edge and Sustainable Model

  • Mark Jeffrey positions Targon as a competitor to established cloud services like Amazon Web Services (AWS), particularly in AI. Targon offers access to preloaded models on a decentralized GPU infrastructure, reportedly at a cost up to 85% lower than traditional providers.
    • AWS (Amazon Web Services): A leading cloud computing platform offering a vast array of services, including GPU instances for AI model training and inference, typically at premium pricing.
  • James acknowledges that the current subsidy from TAO token emissions serves as an effective "bootstrap" mechanism. However, he stresses the imperative to transition towards a sustainable model by reducing dependence on speculative funding.
  • He references Lambda Labs, a GPU cloud provider, which recently secured a $500 million debt facility at a 15% interest rate. This figure indicates the prevailing market cost for data centers to acquire AI chips. Targon aims to offer its compute providers a competitive return (e.g., a 16-17% "coupon") to cover such financing costs; a rough back-of-the-envelope sketch follows this list.
    • James Woodman: "How do we return 16-17%... a coupon to those compute providers and then they can pay off their debt."
  • The long-term strategy involves lowering the cost of capital for compute providers by distributing risk across a decentralized network, contrasting with the concentrated counterparty risk associated with lending to a single entity like Lambda.
  • Investor Takeaway: Targon's strategic shift from a token-subsidized model to one generating organic revenue and offering attractive returns to compute providers is vital for its long-term viability and appeal as an investment.
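
As a rough back-of-the-envelope illustration of the coupon idea, the sketch below compares a provider's debt service at the 15% rate cited with the 16-17% coupon Targon aims to return. The principal amount is an illustrative assumption, not a figure from the episode.

```python
# Back-of-the-envelope economics for a GPU provider that financed hardware
# with debt. Only the 15% debt rate and the 16-17% target coupon come from
# the episode; the principal is an illustrative assumption.

principal = 10_000_000   # assumed hardware financing, in dollars
debt_rate = 0.15         # prevailing debt cost cited (Lambda Labs-style facility)
coupon_rate = 0.165      # midpoint of the 16-17% coupon Targon aims to return

annual_interest = principal * debt_rate
annual_coupon = principal * coupon_rate

print(f"Annual interest owed:   ${annual_interest:,.0f}")
print(f"Annual coupon received: ${annual_coupon:,.0f}")
print(f"Net to provider:        ${annual_coupon - annual_interest:,.0f}")
```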

The "Uber for GPUs" Analogy and Path to Profitability

  • Mark Jeffrey draws an analogy between BitTensor's token subsidy model and Uber's early strategy, where venture capital subsidized rides to capture market share before achieving profitability.
  • James concurs, describing BitTensor as a "novel way to bootstrap your business" that circumvents traditional seed funding by leveraging a built-in subsidy to achieve product-market fit prior to scaling.
  • He underscores Manifold's commitment to building a profitable enterprise, a pragmatic focus he suggests is sometimes overlooked in the crypto sphere.
    • James Woodman: "Isn't that a wild concept? I mean, yeah, we're a business that is thinking about making money."
  • The BitTensor network currently emits TAO tokens valued at approximately $972 million annually. James emphasizes the responsibility to generate real-world economic value exceeding this emission rate to ensure sustainability.
  • Strategic Implication: Investors should critically assess projects heavily reliant on token emissions, prioritizing those with clear pathways to organic revenue and profitability, a direction Targon is actively pursuing. The capacity to generate value surpassing emissions is a crucial indicator of long-term health.

Empowering the "AI Creator" and Coordinating Talent

  • Targon's ambitions extend beyond merely selling subsidized inference on pre-existing models. The company aims to cultivate an "AI creator" economy, mirroring how platforms like YouTube and TikTok utilized creator funds to catalyze their growth.
  • The core idea is to furnish compute resources to talented individuals and researchers globally who are currently hampered by limited access to expensive hardware.
    • James Woodman shared an illustrative example: "He was working on this training run and he said, 'Hey, I think I have something really special here. If someone could lend me an A100 8 box, I'll pay you back. I promise the results will be huge.'"
  • By alleviating this compute constraint, Targon anticipates that these creators will develop innovative models. These models, served within the Targon ecosystem, would then attract application developers and end-users, creating a vibrant marketplace.
  • James distinguishes Manifold's approach from that of centralized AI labs like OpenAI. While both acknowledge the necessity of top-tier talent and models, Manifold posits that success is more probable by constructing a platform that enables global talent, rather than attempting to consolidate all talent within a single corporate structure.
  • Researcher Opportunity: Targon's platform is designed to democratize access to compute, providing researchers an avenue to develop, deploy, and monetize their AI innovations without the prerequisite of joining large corporations or securing substantial private funding.

Decentralization as a Strength and Market Traction

  • Mark Jeffrey draws a compelling parallel to Satoshi Nakamoto's decentralized design for Bitcoin, which has generated trillions in value, contrasting it with the failures of earlier, centralized attempts at digital cash.
  • James concurs, highlighting that BitTensor aims to coordinate not just "hash power" (compute resources) but also "brains" (human talent and innovation).
  • A tweet from JJ, a venture capitalist active in the BitTensor ecosystem, is referenced. It contrasts ChatGPT (60 trillion AI tokens processed per month, built on $70B raised, valued at $30B, and incurring significant losses) with BitTensor (2 trillion AI tokens/month, achieved in under 6 months, subsidized by TAO, and collectively owned).
    • This comparison underscores BitTensor's rapid scaling and capital efficiency relative to centralized AI behemoths.
  • James shares an anecdote from a compute provider who estimated that BitTensor now consumes "easily over 20-30%" of the available on-demand GPU compute capacity, signaling substantial market penetration.
    • James Woodman: "Bit Tensor can consume a lot of the other resources in the world that are outside of OpenAI, Microsoft, the big guys. And together we can build a big player ourselves."
  • Investor Insight: BitTensor's demonstrated ability to rapidly scale its consumption of compute resources and capture a significant share of the on-demand GPU market showcases the efficacy of its decentralized incentive model. This is a key differentiator from the capital-intensive development models of centralized AI.

Mining on Targon and Enterprise-Grade Service

  • For individuals or entities looking to become miners on the Targon network, the process is designed for simplicity: they supply their GPU compute by installing Targon's software and, in return, earn Targon's subnet tokens. Miners are not required to manage or load AI models themselves; Targon's infrastructure handles these complexities.
  • The AI models operate on the aggregated GPU power of the network. More popular or computationally intensive models are allocated a larger share of these resources to ensure optimal performance and speed.
  • When questioned about enterprise-readiness, James asserts that Targon compares "phenomenally well" with established services like AWS, particularly due to its implementation of a Trusted Execution Environment (TEE), which Targon refers to as the Targon Virtual Machine.
    • Trusted Execution Environment (TEE): A secure, isolated area within a computer's main processor. It guarantees that code and data loaded within it are protected in terms of confidentiality and integrity, preventing unauthorized access or modification, even from the system's host operating system.
  • This TEE architecture provides security by design, offering a potential advantage over traditional cloud providers, especially when dealing with data-sensitive industries such as healthcare. Targon's model means clients do not need to rely on contractual "trust" that their data will not be misused; a simplified conceptual sketch of the attestation idea follows this list.
    • James Woodman: "With Targon, I actually think we might be in a better position relative to an Amazon when going, you know, to a healthcare company where the most important thing is actually the data."
  • Actionable Implication for Enterprises: Targon's TEE offers a robust security proposition for enterprises seeking to leverage AI capabilities without compromising data privacy or control. This could unlock new AI use cases in highly regulated sectors.
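
The following is a deliberately simplified, conceptual sketch of the attestation idea behind TEEs: a client verifies the code measurement reported by the secure environment before sending sensitive data. Real TEEs rely on hardware-signed attestation quotes and enclave-bound encryption keys; nothing below reflects Targon's actual implementation.

```python
# Conceptual illustration of TEE-style remote attestation (not Targon's
# implementation): the client only sends sensitive data to an environment
# whose reported code measurement matches a known-good value. Real TEEs use
# hardware-signed attestation quotes, not application-level hashes.
import hashlib

EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-inference-runtime-v1").hexdigest()


def attest(reported_measurement: str) -> bool:
    """Accept the environment only if its reported measurement is the approved one."""
    return reported_measurement == EXPECTED_MEASUREMENT


def send_sensitive_prompt(reported_measurement: str, prompt: str) -> str:
    if not attest(reported_measurement):
        raise RuntimeError("Attestation failed: refusing to send sensitive data.")
    # In a real system the prompt would be encrypted to a key bound to the enclave.
    return f"[sent to attested runtime] {prompt}"


# A healthcare-style client only trusts a runtime it can verify.
reported = hashlib.sha256(b"approved-inference-runtime-v1").hexdigest()
print(send_sensitive_prompt(reported, "De-identify this patient note ..."))
```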

The Dippy Case Study and Subnet Composability

  • Dippy, a popular AI chatbot application with considerable user adoption, is currently transitioning its inference workload from Together AI to Targon.
    • Together AI: A cloud platform that provides access to open-source AI models and GPU compute resources, representing a type of competitor that Targon aims to outperform on cost.
  • The primary motivation for Dippy's migration is significant cost reduction, reflecting the broader trend where "inference is going to zero" in terms of cost.
  • Notably, Dippy itself operates as another subnet within the BitTensor ecosystem. This exemplifies the concept of subnet composability, where different specialized subnets can utilize each other's services.
    • James Woodman: "We have this composability of subnets using other subnets, which is kind of an interesting dynamic, right? It's like deep composability."
  • This interconnectedness fosters a collaborative environment. Subnets are inherently incentivized to support one another because collective success enhances the value of the underlying TAO token, benefiting all network participants.
    • Mark Jeffrey observed: "The more we help out... another subnet... the more we're driving the value of the basecoin TAO up."
  • Strategic Insight: The composability within the BitTensor ecosystem generates powerful network effects and creates organic business development opportunities. This fosters a mutually beneficial environment for subnets like Targon and Dippy, strengthening the overall network.

Payment, Revenue, and Dynamic TAO

  • Targon facilitates user payments through conventional credit cards, processed via Stripe, accommodating the preference for familiar payment mechanisms.
    • James Woodman: "It is whatever the user needs and the user happens to like old-fashioned credit cards."
  • This emphasis on generating tangible revenue is a cornerstone of Targon's strategy for sustainability.
  • James elaborates on Dynamic TAO, a governance mechanism within BitTensor that empowers TAO holders to vote on the allocation of newly minted TAO token emissions to different subnets.
    • Dynamic TAO: A feature enabling TAO token holders to direct a portion of the network's inflation towards subnets they deem valuable. This creates a market-based system for rewarding subnets that demonstrate utility, growth, and contribution to the ecosystem.
  • This system allows the community to channel more resources (via emissions) to promising subnets like Targon, thereby enabling them to secure more compute power and enhance their competitive posture against centralized AI entities.
  • Targon's future roadmap includes utilizing the revenue generated from its services to buy back its own dynamic subnet tokens (often called alpha tokens), ideally through a transparent smart contract. This would create a direct economic link between service revenue and token value; a simple numeric sketch of this loop follows this list.
  • Investor Consideration: Dynamic TAO introduces a decentralized, market-driven approach to resource allocation within the BitTensor network. Subnets like Targon, which can demonstrate real-world utility and generate revenue, are well-positioned to attract increased emissions and, consequently, further investment and growth.
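
A minimal numeric sketch of the revenue-to-buyback loop described above. The monthly revenue, buyback share, and token price are illustrative assumptions; the episode does not specify these figures or the smart-contract mechanics.

```python
# Minimal sketch of the revenue-to-buyback loop for Targon's subnet (alpha)
# token. All figures are illustrative assumptions; the episode does not
# specify the revenue, the buyback share, or the contract mechanics.

monthly_revenue_usd = 250_000    # assumed Stripe revenue from inference sales
buyback_share = 0.50             # assumed fraction of revenue routed to buybacks
alpha_price_usd = 2.00           # assumed market price of the subnet token

buyback_budget = monthly_revenue_usd * buyback_share
tokens_repurchased = buyback_budget / alpha_price_usd

print(f"Monthly buyback budget:   ${buyback_budget:,.0f}")
print(f"Alpha tokens repurchased: {tokens_repurchased:,.0f}")
```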

Go-to-Market Strategy and Product Focus

  • Targon's go-to-market strategy is not a broad-stroke attempt to capture all users from competitors like Together AI. Instead, it is a focused effort targeting a specific client archetype, epitomized by Dippy.
  • This "Dippy profile" characterizes companies that are experiencing rapid growth, are sensitive to operational costs (often not heavily venture-backed), and generate a substantial volume of AI token inference monthly.
    • James Woodman: "Dippy is in many ways the perfect client. They're rapidly growing. They're sensitive to cost... And they're doing a good amount of tokens a month."
  • The strategic plan involves ensuring Dippy's success on Targon, developing this into a compelling case study, and then leveraging this to attract similar clients. The core selling proposition is the significant (up to 85%) cost reduction for what is becoming commodity AI inference.
  • James also highlights the critical importance of user interface (UI) and overall productization. He notes these aspects are frequently neglected in the crypto space but are essential for competing effectively against polished offerings from major players like OpenAI.
  • Actionable Insight: Targon's targeted go-to-market strategy, which concentrates on a customer segment where its cost advantage is most pronounced, coupled with a strong commitment to a high-quality product experience, significantly enhances its likelihood of gaining market traction and scaling efficiently.

Mainstream Recognition and The Future of Work

  • Mark Jeffrey recounts his experience discussing BitTensor and Targon with Jason Calacanis on the influential podcast "This Week in Startups." He notes Calacanis's genuine curiosity, which was surprising given his historical skepticism towards crypto.
  • James shares the Targon team's excitement during this mention, and also recalls a similar moment when Chamath Palihapitiya spoke about BitTensor approximately a year earlier.
  • James reiterates the transformative potential of rethinking incentive structures and collaborative work models, a paradigm shift pioneered by Bitcoin and now being applied by BitTensor to the field of artificial intelligence.
    • James Woodman: "We can apply it here and by the way, we can get 85% cost savings on the product. I mean, oh my god, right? It just feels hard to beat."
  • He emphasizes that while the initial subsidy model is instrumental for gaining early traction and achieving product-market fit, the ultimate objective is to contribute tangible, productive value to the broader economy and thereby accelerate Targon's self-sustaining flywheel.
  • Researcher/Developer Call to Action: James extends an open invitation to talented individuals to engage with BitTensor and Targon (accessible via targon.com). He states that Targon is actively recruiting and seeking skilled contributors for what he views as a paramount global mission: ensuring that the future of AI is collectively owned and governed by the many, rather than controlled by a select few.

Conclusion

This episode reveals Targon's ambitious plan to build a decentralized AI powerhouse by offering significantly cheaper inference and fostering an open ecosystem for AI creators. For investors and researchers, Targon's focus on sustainable revenue, enterprise-grade security, and talent coordination signals a potent alternative to centralized AI, warranting close observation.
