This episode reveals how AI's true potential is unlocked not by generic models, but by grounding AI in the most personal and proprietary dataset of all: human biology.
The Founders' Journey: System Failure as the Catalyst for Innovation
- Jonathan Swerdlin (Function Health): Motivated by his father's cancer diagnosis and his own frustrating, expensive journey to understand his health, Jonathan co-founded Function to give people control over their biological data. He emphasizes that the traditional medical system is a poor product for creating health.
- Daniel (Slingshot AI / Ash): Coming from a family of mental health professionals and with a background in AI research, Daniel saw an opportunity to address the massive care gap in mental health. He notes that for every therapist there are more than 10,000 people, and that "54% of those with a mental illness... don't have any care whatsoever."
- Strategic Insight: Both founders bypassed traditional, regulated systems (like insurance payers) to go directly to consumers. This direct-to-consumer (D2C) approach, while challenging, allows for faster iteration and a product experience built entirely around user needs, a strategy that resonates with the decentralized, user-centric ethos of many crypto projects.
Redefining User Experience for Human Biology
- The speakers agree that simply replicating existing human-led processes with AI is a failure of imagination. Jonathan illustrates the point vividly: "To look backwards, even call an AI doctor or an AI therapist is like saying... it's the motorized horse."
- This highlights a core challenge for AI researchers: building models that can understand and integrate complex, non-digital human context. For investors, this signals an opportunity in companies that are not just digitizing old workflows but creating entirely new interaction paradigms based on unique data sources.
Momentum as a Moat: The Power of Rapid Iteration and Trust
- Daniel's Perspective: For Ash, momentum comes from building trust. Users find therapists through referrals, and similarly, they adopt Ash after hearing stories of its impact on others' lives.
- Jonathan's Perspective: For Function, which will have run nearly a billion lab tests within a year, momentum is about "getting the reps fast." This rapid iteration cycle allows their team to learn and improve the product at a pace impossible in traditional healthcare.
- Actionable Insight: Jonathan cautions that momentum must be in the "right direction" and not mask underlying problems. For investors, this is a critical due diligence point: is growth a result of a strong product-market fit, or is it a vanity metric covering up poor retention or a flawed model?
Rethinking Engagement: Beyond Daily Active Use
- Function Health's Vision: Jonathan envisions a future of "hourly active use," where personal health data becomes the foundational context for all personal AI interactions. He argues, "We have to literally bring your biology online," making health the integrated center of a user's digital life, not a siloed app.
- Ash's Model: Daniel compares Ash to a mattress—"not everyone needs therapy all the time, but everyone needs therapy at some point." Engagement is intense and purposeful when a user is in need, focused on building a therapeutic alliance—a collaborative, trusting relationship between a client and therapist (or AI) aimed at achieving goals. The product is "built to be deleted" once the user's immediate needs are met, fostering long-term trust.
The AI Differentiator: From Raw Data to Actionable Insight
- The host admits, "I don't understand biomarkers... All I want is am I healthy? Can I get better? And what do I do?" This perfectly frames the problem AI solves.
- Daniel contrasts Ash with generic chatbots. While ChatGPT might offer generic validation, Ash engages users with challenging, Socratic questions ("Why is anger a bad thing?") to provoke deeper self-reflection and insight. This demonstrates the power of a specialized foundation model—a large-scale AI model trained on a vast quantity of data that can be adapted for specific tasks, in this case, for psychology.
Augmenting, Not Replacing, Human Expertise
- Jonathan explains that when a patient sees a doctor armed with Function's data, the conversation is no longer vague. The doctor has a clear signal, allowing them to "finally execute on" their advanced training.
- Daniel notes that many therapists refer their patients to Ash and even use it themselves, seeing it as a tool that provides continuous support and helps people address issues before they escalate. This collaborative dynamic is a key trend for researchers and investors to watch.
The Ethics of AI in Health: Autonomy vs. Intervention
- Daniel shares a powerful insight from a conversation with therapist and author Lori Gottlieb: sometimes, it is better for a person to remain in a painful situation to learn the lesson for themselves rather than having an AI (or therapist) provide a short-term solution.
- The AI must be designed to guide and challenge, not to make decisions for the user. This preserves the user's sense of agency and humanity. "What builds an alliance is being challenged. It's finding a new way to look at your problem."
- Strategic Implication: For Crypto AI researchers, this highlights the immense challenge and importance of building models that can navigate complex ethical gray areas. For investors, a company's ethical framework and approach to user autonomy is a critical, long-term risk factor.
Conclusion
This discussion underscores that the next frontier of AI is built on proprietary, deeply personal data like biology. For Crypto AI investors, the key takeaway is to identify projects that are not just building models, but are creating new ecosystems of trust and data sovereignty that give users control while unlocking unprecedented value.