This episode features Augusto "Aghi" Marietti, founder and CEO of Kong, and a16z's Martin Casado, who unpack one of Silicon Valley's most remarkable startup stories—from near-death experiences to building the definitive API infrastructure for the AI era.
- The Seven-Year Hustle
- Kong's Breakout Moment
- APIs as the Language of AI
Key Takeaways:
This episode reveals the brutal, decade-long journey of Kong's founder from near-failure to API dominance, offering a direct playbook for how foundational infrastructure will capture value in the coming AI agent economy.
The Early Hustle: Arriving in the US with $600
Aghi, the founder and CEO of Kong (formerly Mashape), recounts his arrival in San Francisco from Italy with his co-founder. They landed on 90-day tourist visas with only $600, facing a stark "make it or break it" reality. Their mission was to secure funding before their time and money ran out, knowing failure meant returning to Italy with nothing. This initial phase was defined by extreme resource constraints and immense pressure, setting the stage for a culture of intense struggle and resilience.
Securing the First Check: The Stanford Mixer and Travis Kalanick's Kitchen
Desperate for connections, Aghi and his co-founder attended a Stanford entrepreneurship mixer. Arriving late, they took the paper registration list containing the emails of 400 VCs and entrepreneurs, and Aghi spent the entire night personally emailing every contact, which led to a handful of meetings. This hustle resulted in a meeting with Kevin Donahoe, an early YouTube team member, who wrote their first check for $17,000. The final negotiation for their initial $51,000 angel round took place in Travis Kalanick's kitchen, with the Uber founder coaching the young and naive Aghi through the process.
Surviving on a Shoestring: The Tuna Pasta Era
After securing the initial funds, the founders returned to Italy to get proper visas before coming back to the US. Without social security numbers or credit scores, they couldn't pay themselves a salary. The company issued them a $1,000 monthly promissory note, which had to support all three co-founders in San Francisco. They survived for over a year by living on a single mattress and eating a diet of rice, beans, tuna, and pasta—the cheapest combination of carbs and protein they could find. This period of extreme frugality forged a deep-seated resilience within the founding team.
The Pivot to an API Marketplace
The initial product was a drag-and-drop application builder using APIs, a concept Aghi now admits was about a decade too early. Realizing the original idea was failing and with money running low, the team took a trip to Honolulu to rethink their strategy. They identified the emerging API economy as the core trend and pivoted to building an API marketplace, a platform for API producers and consumers to connect. They launched quickly, gaining press coverage from TechCrunch and other outlets, which generated enough traction to begin raising a proper seed round.
Raising the Seed Round: Landing Bezos and Schmidt
The pivot to an API marketplace attracted serious investors. The round was led by NEA and Index Ventures, with an initial commitment from CRV. Aghi strategically secured investments from Jeff Bezos's and Eric Schmidt's personal funds. He got an introduction to Bezos's family office by hiring their lawyer, correctly betting on Bezos's interest in marketplaces and APIs. The investment from Schmidt's fund came serendipitously after another startup in their co-working space, backed by Schmidt, recommended them as the "hardest workers" in the building. This $1.5 million seed round provided the capital to scale the team.
The Series A and the Marketplace's Failure
Despite raising a $6.5 million Series A led by CRV, the API marketplace struggled to find a viable business model. Aghi explains that successful marketplaces require a long tail of low-power suppliers (like Airbnb hosts), but the API economy was dominated by a few powerful players like Stripe and Twilio. Furthermore, a lack of exclusivity and quality control, combined with poor unit economics, meant the model could never scale profitably. The business stagnated, burning through cash with little to show for it.
The Birth of Kong: The Open-Source Pivot from the Brink of Death
Facing imminent failure, the team identified their most valuable asset: the powerful internal engine built to run their marketplace. This engine, an API Gateway (a system that manages API traffic, handling tasks like authentication, rate limiting, and security), was rebuilt three times and had become incredibly robust. They made a critical decision to open-source this technology, rebranding it as Kong. This pivot was a last-ditch effort, funded by a $2 million bridge round from existing investors as they were completely "out of gas" and weeks from shutting down.
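To make the gateway concept concrete, here is a minimal sketch in Python, with invented keys and limits, of the two checks described above: API key authentication and rate limiting performed in front of an upstream service. It illustrates the pattern only, not Kong's actual implementation, which handles far more (load balancing, transformations, plugins, observability).

```python
import time
from collections import defaultdict

# Hypothetical, simplified stand-in for an API gateway's request pipeline.
API_KEYS = {"demo-key-123": "acme-corp"}   # consumer credentials (invented)
RATE_LIMIT = 5                             # allowed requests per window
WINDOW_SECONDS = 60

_request_log = defaultdict(list)           # consumer -> request timestamps


def handle_request(api_key: str, path: str) -> tuple[int, str]:
    """Authenticate the caller, enforce a rate limit, then 'proxy' upstream."""
    consumer = API_KEYS.get(api_key)
    if consumer is None:
        return 401, "missing or invalid API key"

    now = time.time()
    recent = [t for t in _request_log[consumer] if now - t < WINDOW_SECONDS]
    if len(recent) >= RATE_LIMIT:
        return 429, "rate limit exceeded"
    recent.append(now)
    _request_log[consumer] = recent

    # A real gateway would forward the request to the upstream service here
    # and stream the response back to the client.
    return 200, f"proxied {path} for {consumer}"


if __name__ == "__main__":
    print(handle_request("demo-key-123", "/orders"))  # (200, ...)
    print(handle_request("wrong-key", "/orders"))     # (401, ...)
```

Centralizing these checks in a single layer, instead of re-implementing them in every service, is what made the marketplace's internal engine valuable as a standalone product.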
From Starvation to Hyper-Growth: Finding Product-Market Fit
The open-sourcing of Kong in 2015 was an immediate success, finding explosive product-market fit. The company, which had struggled for seven years, suddenly experienced rapid growth. This period of starvation is now memorialized in the company's "Founders Award," where the best employee receives 2,555 shares of stock—symbolizing the 2,555 days (seven years) of struggle. The market timing was perfect, as the rise of cloud computing and microservices (an architecture where applications are broken into small, independent services) created a massive need for API infrastructure.
Navigating the AI Shift: The Future of API Infrastructure
Aghi views the current shift to AI as another major tailwind for Kong. He argues that AI agents will consume the internet programmatically through APIs, not through traditional user interfaces. This machine-to-machine communication represents a fundamental shift in how data and services are exchanged, creating an explosion in API traffic. Kong is positioned as the essential infrastructure to manage, secure, and govern this new wave of AI-driven connectivity.
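As a rough illustration of that machine-to-machine pattern, the sketch below (Python, with a made-up endpoint and API key) shows how an agent consumes a service: authenticated HTTP requests returning structured data, with no browser or UI involved.

```python
import json
import urllib.request

# The endpoint and key below are invented for illustration only.
API_URL = "https://api.example.com/v1/weather?city=Milan"
API_KEY = "agent-key-456"


def agent_tool_call(url: str, api_key: str) -> dict:
    """Fetch structured data the way an agent would: headers and JSON, no UI."""
    request = urllib.request.Request(url, headers={"apikey": api_key})
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# An agent loop would invoke agent_tool_call() whenever its reasoning decides
# it needs external data, generating API traffic instead of page views.
```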
The Convergence of APIs and AI: A Unified Connectivity Layer
Aghi predicts that the distinction between traditional API traffic and AI traffic will disappear. He notes that even "boring" infrastructure problems like authentication, key management, and billing are critical for AI applications. Just as microservices abstracted common functions into a gateway, Aghi believes the industry will do the same for LLMs (Large Language Models). Instead of building authentication and token rate-limiting into every AI application, enterprises will abstract this logic into a unified AI gateway.
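A minimal sketch of that abstraction, with invented team names, token budgets, and backend labels: a shared gateway layer authenticates the caller, enforces a token budget, and chooses an LLM backend so that individual applications do not have to. This illustrates the idea only and is not Kong's AI gateway.

```python
import random

# Hypothetical AI-gateway decision layer: all names and limits are invented.
TEAM_KEYS = {"team-a-key": "team-a", "team-b-key": "team-b"}
TOKEN_BUDGETS = {"team-a": 100_000, "team-b": 50_000}   # tokens per day
LLM_BACKENDS = ["llm-provider-1", "llm-provider-2"]     # placeholder names

_tokens_used = {"team-a": 0, "team-b": 0}


def route_llm_call(api_key: str, prompt_tokens: int) -> str:
    """Authenticate, enforce a per-team token budget, then pick a backend."""
    team = TEAM_KEYS.get(api_key)
    if team is None:
        raise PermissionError("unknown API key")

    if _tokens_used[team] + prompt_tokens > TOKEN_BUDGETS[team]:
        raise RuntimeError(f"token budget exceeded for {team}")
    _tokens_used[team] += prompt_tokens

    # Routing policy could weigh cost, latency, or model capability;
    # here it is simply a choice between interchangeable backends.
    return random.choice(LLM_BACKENDS)


if __name__ == "__main__":
    print(route_llm_call("team-a-key", prompt_tokens=1_200))
```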
Lessons for Founders: Persistence and Long-Term Trends
Reflecting on his journey, Aghi emphasizes that he never considered giving up, driven by the fear of returning to Italy as a failure. His advice to founders is to anchor their companies to a durable, decade-long trend, as success always takes longer than expected. By focusing on a long-term shift, a company has time to make mistakes, pivot, and ultimately find its place in the market. He stresses the importance of keeping the burn rate low in the early days to survive long enough to see that trend mature.
Conclusion
The evolution of API infrastructure for microservices provides a direct playbook for the emerging needs of a multi-agent, multi-LLM world. Investors and researchers should focus on companies building the essential infrastructure for AI connectivity—authentication, governance, and routing—as this layer is set to capture immense and durable value.