Ventura Labs
February 2, 2026

Gavin Zaentz & Pranav Ramesh: Leadpoet, Lead Generation, Intent-Driven Sales Automation | Ep. 79

How LeadPoet's Decentralized AI Solves Sales' Stale Lead Crisis by Ventura Labs

This summary is for anyone tired of bad sales data and curious how decentralized AI can fix it. You will learn how LeadPoet delivers high-intent, double-validated sales leads, disrupting a stagnant industry with real-time data quality.

  • 💡 How does LeadPoet ensure lead data is fresh and relevant, unlike traditional providers?
  • 💡 What specific mechanisms does BitTensor enable to scale lead generation and qualify intent signals?
  • 💡 What is LeadPoet's vision for automating the entire sales process, and how does it plan to achieve it?

The sales world runs on leads, but the current system is broken. Incumbents like Apollo and ZoomInfo churn out stale, low-quality data, leading to 60-70% bounce rates and wasted effort. Pranav Ramesh and Gavin Zaentz, co-founders of LeadPoet Subnet 71, are here to change that. They explain how BitTensor's decentralized AI network is not just generating leads, but revolutionizing the entire sales funnel with unprecedented data quality and automation.

Top 3 Ideas

🏗️ The Validation Imperative

“If you can't validate it, miners will feed you not what you want and they'll get rewarded for it.”
  • Quality First: LeadPoet's core principle is rigorous data validation. Miners are only rewarded with Alpha tokens for leads that meet strict quality standards, including valid emails, LinkedIn profiles, company details, and roles. This ensures every lead entering the database is legitimate.
  • Double Check: Unlike competitors who validate leads once, LeadPoet performs "double validation." Leads are re-checked for freshness and relevance immediately before being delivered to a client, preventing wasted effort on stale contacts.
  • No Catch-Alls: The system explicitly rejects "catch-all" emails, which often lead to high bounce rates. This focus on "email okays" significantly boosts deliverability and response rates for users.
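The validation gate described in these bullets could be sketched roughly as follows. This is a minimal illustration, not LeadPoet's actual validator code: the field names, regexes, and the `is_catch_all` flag are all assumptions for the sake of the example.

```python
import re

# Hypothetical field-level checks illustrating the kind of quality gate
# described above. A lead earns rewards only if every required field is
# present and well-formed, and catch-all inboxes are rejected outright.
REQUIRED_FIELDS = ("email", "linkedin", "company_linkedin", "location", "role")
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")
LINKEDIN_RE = re.compile(r"^https://www\.linkedin\.com/in/[\w-]+/?$")

def passes_validation(lead: dict) -> bool:
    """Reject a lead unless every required field is present and well-formed."""
    if any(not lead.get(f) for f in REQUIRED_FIELDS):
        return False
    if not EMAIL_RE.match(lead["email"]):
        return False
    if not LINKEDIN_RE.match(lead["linkedin"]):
        return False
    # Catch-all domains accept mail for any address, so a positive email
    # check is meaningless; the subnet rejects these to protect deliverability.
    if lead.get("is_catch_all", False):
        return False
    return True

lead = {
    "email": "jane.doe@example.com",
    "linkedin": "https://www.linkedin.com/in/jane-doe",
    "company_linkedin": "https://www.linkedin.com/company/example",
    "location": "Austin, TX",
    "role": "VP of Sales",
    "is_catch_all": False,
}
print(passes_validation(lead))  # True: all fields present and well-formed
```

Only leads that clear every check would enter the inventory and trigger a miner reward; anything else earns nothing.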

🏗️ Outcompeting Incumbents

“The current lead providers are incredibly painful to use because they keep providing you the same leads over and over again. And a lot of those leads are expired.”
  • Decentralized Scale: LeadPoet leverages BitTensor's global network of miners to source leads at a scale centralized providers cannot match. This allows them to quickly build a vast, fresh database, moving from 500 to 50,000 leads per day in just months.
  • Intent Signals: Beyond basic contact info, LeadPoet's agent competition identifies high-intent leads by analyzing social media posts, PR, company filings, and funding rounds. For example, a company that recently raised a Series A and mentioned hiring is a high-intent target for a recruiting firm.
  • ImageNet Analogy: Think of BitTensor like the ImageNet competition, but for any kind of intelligence. ImageNet gathered researchers globally to compete on image recognition, pushing the boundaries of what was possible. BitTensor does the same, but for diverse AI tasks, with subnets like LeadPoet focusing that collective intelligence on specific problems, like finding the perfect sales lead.

🏗️ Automating the Sales Funnel

“Our golden egg down the line is automating the entire sales process, not just providing you with high-intent leads.”
  • Full Automation: LeadPoet's vision extends beyond lead generation to automating the entire sales process. This includes qualification, outreach via email and LinkedIn, and even booking meetings directly onto a sales executive's calendar.
  • Time Back for Sellers: The goal is to free sales professionals from the 80% of their time spent on manual outreach and qualification. By providing high-quality data and automation tools, sales teams can focus solely on selling to genuinely interested prospects.
  • Ecosystem Integration: Partnerships, including Nvidia Inception and DSV incubation, provide resources and a network to accelerate this vision. LeadPoet also supports other BitTensor subnets by automating their sales efforts, fostering a stronger ecosystem.

Actionable Takeaways

  • 🌐 The Macro Shift: The shift from centralized, static data aggregation to decentralized, real-time, incentivized intelligence networks is fundamentally changing how data-intensive industries operate. BitTensor's model allows for the rapid, high-quality commoditization of digital goods, like sales leads, by leveraging global, competitive AI contributions, making it possible to outscale and outcompete traditional data monopolies.
  • ⚡ The Tactical Edge: Investigate subnet opportunities where incumbent data quality is low and validation is a core challenge. Builders should prioritize programmatic validation mechanisms in their subnet design to ensure miner output aligns with desired quality, creating a defensible moat against synthetic or low-value data.
  • 🎯 The Bottom Line: The future of sales is not just about more leads, but smarter, fresher, and more relevant ones. LeadPoet's approach signals a broader trend where decentralized AI will automate entire business functions, turning previously manual, data-heavy tasks into efficient, high-signal commodities. Position your business or investments to capitalize on these automated intelligence layers over the next 6-12 months.

Podcast Link: Click here to listen

I'd say the core essence of what's required in a subnet is how do you validate it? If you can't validate it, miners will feed you not what you want and they'll get rewarded for it.

So, you need to think of how you're going to validate it, usually in a programmatic approach that lets you handle a lot of scale in miner outputs, and you're able to determine: is this legit? Is this what I want? Is this miner actually providing me compute? Are they providing me a real lead? Are they actually committing to a repo and improving the code on that repo?

If you're able to capture that aspect, the rest comes naturally. A good example of this is Lead Poet. If we were to try and compete with an Apollo or Zoom Info, they already have hundreds of millions of leads in their database.

In order for us to reach that point, we leverage the miner network that we have, with all of them using the compute and the models on their end to source 50,000 leads a day across different miners, say 5-10 UIDs, for example. This is just an example out there.

Then we just aggregate all of those results and then we store it in our DB. We're able to get to that level so much quicker than if we were doing it by ourselves. Same thing with Shu. They're aggregating compute from so many different miners to compete with some of the larger incumbents in the space.

Our north star in building this subnet is data quality. This is part of ensuring that the data coming in is good, but also on the way out, because any funnel is only as strong as what you put in at the top.

If you put in bad data, even if you refine it and do research, you're not actually reaching out to legitimate emails or people that are in that relevant market. So yeah, that's why we focus so much on the quality of the leads.

This is incredible. I cannot believe that we could have miners essentially do all of this intelligent work for us. They're compensated through the alpha token that the subnet gives them and then we are able to take that product that miners give us validated through the validators and then actually provide it to the client for a much lower cost than what we're currently doing right now.

Just that whole business model that BitTensor actually allows is incredible in my opinion and what got me into it.

Welcome to the Ventura Labs podcast. Today we're joined by Pranav and Gavin, co-founders of Lead Poet Subnet 71. They share their journey from meeting at NASDAQ and discovering BitTensor to launching a subnet for high-quality lead generation; scaling from zero to 1.1 million validated leads in the first few months; the upcoming agent competition for qualification and automated outreach; and partnerships across the ecosystem, including inclusion in Nvidia's Inception program and DSV incubation.

Now back to Pranav and Gavin with the 79th Ventura Labs podcast.

Okay, so Pranav and Gavin of Subnet 71, Lead Poet, how are you guys doing today?

Doing good. Doing good. How about you?

I'm doing great, man. I'm talking to the co-founders of Lead Poet here. So, let's just jump right in. You guys have over a million leads sourced, 1.1 million as of now. Talk me through what this means and what this subnet does. Why is this number significant?

Totally. Yeah, it's been definitely a great journey these past two months coming from zero leads to now in the seven figures. Initially we were pulling in about 500 leads a day. Now we're up to 50,000.

The general purpose of why are we doing this is we're building up a large inventory of leads. So when our users have demand for this commodity we're producing from the subnet, they can tap into a wide database and our agent will surface the leads that have the highest intent for what they're actually selling.

Let's take a step back from this 1.1 million number. What is the basis for this subnet, Lead Poet? I know a lot of development time went in before actually launching this. I met you guys at Endgame last year, a few months before the subnet launched, and you were already telling me about this idea and exactly how it was going to work. So, what is Lead Poet? What was the reason for making the Lead Poet subnet?

Okay. So, I'm going to maybe start this off with a story if you have a bit of time. Me and Gavin met at NASDAQ beforehand, and while we were over there we started to talk about crypto.

I was launching something called Bitcoin options at that point in time, and I was trying to learn more about crypto. I met Gavin over there cuz a lot of people directed me towards him while I was there, telling me that if you're going to get into crypto, you need to talk to Gavin cuz he's the crypto wizard within the firm.

I was like, oh okay, I'm going to go and talk to him, and he told me about various different types of crypto. He told me about BitTensor in specific, and as somebody who's been a strong proponent of open source development since I was probably 16, 17 years old, I found BitTensor fascinating.

I did a bit of due diligence, and then me and Gavin got together over the next few days and started to brainstorm the best ideas we could possibly have to build a subnet on BitTensor. The number one thing that we both landed on was Lead Poet, and specifically the pain point of lead generation.

Me and Gavin have had prior startups in the past. We know how painful lead generation can be. If you use current incumbents out there, like an Apollo, a ListKit, a ZoomInfo, you'll realize that when you reach out to the leads they provide, 60 to 70% of the time you're going to receive an automated response back saying that this inbox has been deactivated due to an influx of spam emails.

So, the current lead providers are incredibly painful to use cuz they keep providing you the same leads over and over again. And a lot of those leads are expired.

We knew that we could completely change that up if we created Lead Poet, which would leverage BitTensor to source highly qualified, double-validated leads, which is how the subnet currently works. Number one, the leads are all high intent, which means those leads are actually in market to buy a product from the customer. Number two, they're validated: the contact method is checked, making sure that the email inboxes, the LinkedIn, and any other contact methods that may be added in the future are verified, so that the person who actually uses them to contact the lead will actually get a response.

And it's not something where it's spam, right? Where their email just goes directly into the spam box and they never receive a response for that reason.

Since then, we started off probably going to so many different events in March. We went to Endgame like you mentioned, we went to DNA, we went to a few others as well, and we've been trying to get a subnet since then because we had that strong a conviction in this idea.

Come October, we were incubated by DSV, we got the subnet, and we've been building ever since. And here we are doing this podcast with you.

All right, crypto wizard. How'd you find BitTensor?

In terms of how I came across BitTensor, I personally found crypto for myself in 2020, as many did, and got involved with Bitcoin. Then shortly after, Helium, a project focused on creating this DePIN network with LoRa.

Now it's 5G, but I was setting up a bunch of those hotspots mining. And I just found it very compelling that people could be earning this incentive to actually be building something that has real world value. BitTensor is kind of the next evolution of that.

Why I got attracted to it was I'd been following it when it just had one incentive, but as they expanded to the subnets, having multiple incentives and the ability to customize incentives, that showed me that this was kind of the next frontier in terms of a blockchain.

The ability to customize those incentives was really compelling, and seeing the projects and the way they were able to channel compute, channel storage, just made me really attracted to the project. I've been an investor in BitTensor since 2023.

2023 precedes a lot of current subnet owners in terms of investment into BitTensor, and you said you got here before subnets were even a thing. There was just one network. How did you build that conviction when subnets weren't even here yet, and how has your conviction changed from then to now?

Actually I invested in 2024. I found the project in 2023 and was following it and the subnets is what really triggered my interest to invest because the ability to customize the incentives to have multiple projects.

When it was just one network, it was interesting, but it was similar to many crypto projects where they're incentivizing one specific task. But when the subnets launched, I saw it as like a layer one for any application, where you're able to define the incentive, define how you validate it, and channel these digital commodities in the most effective manner.

Pranav, what was your first impression when the crypto wizard started talking about BitTensor to you?

My first impression was: this sounds really interesting. It was at that point in time, 2024, and I had never really gone into decentralized AI applications before. I understood decentralized apps. I understood AI as a whole, but I had never really combined the two.

When he first talked about it, I'm like, that's fascinating. I can't believe this is the first time I'm hearing about BitTensor. So I go back and do some due diligence, I read the white paper, and I was absolutely fascinated.

I was like, this is such an interesting application. I described it to a few of my friends who were founders as well, and they were like, that's incredible. Like, if my company doesn't do well, or if I exit my company, the next thing I'm going to build is going to be on BitTensor.

It just seemed like there was a massive information gap, because anybody I talked to about BitTensor once Gavin told me about it instantly had a light bulb moment: this is incredible. I cannot believe that we could have miners essentially do all of this intelligent work for us.

They're compensated through the alpha token that the subnet gives them, and then we are able to take that product that miners give us, validated through the validators, and actually provide it to the client for a much lower cost than what's out there right now.

Just that whole business model that BitTensor actually allows is incredible in my opinion, and it's what got me into it.

You guys go to a lot of events as you've said and I assume you talk to a lot of people that don't know what BitTensor is yet. So when explaining what your subnet does, how do you explain BitTensor enough so they can understand what your subnet does?

Funnily enough, in lead generation there is a concept of contributors that provide data. So it's kind of a cheat code to explain this. We explain it as: we have contributors around the world providing us data, and we reward them for that data. And as we roll out our agent tool competition in the coming weeks, or actually this coming weekend, that will also be tapping into contributors around the world competing to develop the best model.

I think when you say contributors, people understand it a lot better. Obviously with miners, people are like, what does that mean? Are they getting down in the shafts and pulling up gold? Or even if you know about miners, you think about Bitcoin: they run one type of code continuously. It's not like they're optimizing it; obviously they are through the hardware, but the software is pretty stagnant. So I think the best way is just through that contributor lens. And then, if they ask more questions and want to go deeper, you can mention that they're competing, and the ones that are able to outcompete and provide more data or better models are going to earn a larger portion of the pie.

Generally when I've put it through that contributor lens, a lot of people understand it quite quickly.

One of the most elegant examples of this, and we've used this example before, both me and Gavin, is ImageNet. It's something that happened well before BitTensor came around. ImageNet was essentially this, I'll call it, open source competition. Not really open source, but it was the first of its kind.

It essentially had researchers from across the globe compete on a public data set, trying to figure out ways to recognize images using their models to the best of their ability. They created a leaderboard showing which of the models these researchers were publishing were actually doing the best.

Each of these researchers was creating models that competed with the next researcher's. The results that came out of the ImageNet competition were so incredible that they beat all other image recognition benchmarks at that point in time, blew them out of the water, and kind of illustrated to the entire planet exactly how strong open source competition can be.

BitTensor builds on that. When we communicate it to other people, we communicate that the power open source, global competition can provide to a specific industry or specific source of intelligence is absolutely incredible. It can change the game and make it that much better.

That's what BitTensor provides. Each of the subnets defines what that source of intelligence is that we're trying to improve. For us, it's lead generation, but for BitMind it could be deepfakes. For any other subnet out there, it could be a completely different thing. That's what makes BitTensor so fascinating.

In terms of your specific competition, Lead Poet, what is the design here? What are the miners rewarded for?

Right now, and this is before we release the agent competition that Gavin mentioned previously, they're rewarded for sourcing high-quality leads. This could be a lead with a valid email, a valid LinkedIn, a valid company LinkedIn, a valid location, a valid role, all of these different details that validators check on their end to make sure the lead's quality details are all perfect before it enters the database.

Only if the leads that the miner has provided us are up to our standards do we actually give them the incentive associated with that lead. If miners are providing us bad leads, leads with bad data, they don't receive any incentive. It doesn't enter our lead pool, or our lead inventory as we like to call it, and they don't get rewards.

You qualify a lead based off if they have a LinkedIn with a location, company, and email?

So, that's how we validate that the lead is legitimate and that the data is high quality. But the way we actually qualify a lead based on a user's request is through this agent that we're developing right now through this competition that's launching shortly. Based on the user's request, it will dynamically look at the database and surface the leads that have the highest intent.

Whether that's through social signals, or whether the company has reached some milestones which trigger interest in the type of product the user is selling, we surface those leads, and that's what constitutes the qualified nature of the leads we're giving to these end users.

Give me a specific example here.

So a good example: recently, one of our clients, a private equity firm focused on both buy side and sell side, gave us this task. Their ideal customers are cybersecurity firms, medium to small-size firms that manage cybersecurity, that are either interested in selling their firm or interested in potentially acquiring another cybersecurity firm.

What the agent does is work off that criteria. There are kind of three initial attributes: you know that it's cybersecurity, you know that you want to speak to the owner, and you know that the company size fits in this range. They also had a country and state; they only wanted leads in a specific region in the US. So that's the first criteria: are these leads in the target market or not? But we go a step further and ask, looking at those leads, do they actually have intent to sell their company?

Has the person worked there 30, 40 years? Are they reaching a point where, okay, you've worked there 30, 40 years, so you have a higher intent to sell than someone that started a new cybersecurity firm last year? So that would be someone that has intent to sell.

On the other side, we also surfaced some leads that had intent to buy. They've already acquired a cybersecurity firm in the past years. If they've done that, they clearly have a model for growing their firm, and they may be interested in another acquisition.

That kind of shows two different ideal customers under one target market that that user had. And they came back saying, wow, these are really interesting intent signals you provided. I wouldn't have even thought of some of these being intent signals, but this has been really helpful. And we're going to be a continuing user of the platform when the full platform launches next month.

Currently, we're doing a more white-glove beta for these early users just to give them a taste of the data quality. And so far, we've had very positive feedback all around.

How are you guys doing the validation here for the intent signals? Is it all going to be based off LinkedIn data or will there be other sources?

That's a great question. Right now, when we're qualifying a lead, we're checking whether this lead is the right fit for a specific client and what they requested. First, we're checking whether this lead is within the right ICP.

For example, if that client requested for cyber security companies, we do not want to give them a lead which maybe is showing intent but is in a completely separate industry like in agriculture, right? Like we want to make sure it fits the ICP that they've defined.

Number two, we also check whether or not this person has decision-making ability, so they're not, like, a lower-level analyst who maybe has no influence on whether or not the firm buys the product the client is providing. We make sure they actually have the ability to make that decision.

Finally, the most important part, and what you just asked about, is the actual intent signal. This could be something as simple as a social media post from that lead. It doesn't need to be specifically from LinkedIn. It could be on X, it could be other social media platforms, etc., which we verify using our own set of APIs.

We have ScrapingDog and other APIs to actually check whether the content of the social media post that the miner linked, which says this person is in market for the product the client is selling, is legit. So it's a real social media post from that specific company or specific lead; it doesn't 404 out as a non-existent social media post.

Then, number two, the validators actually check the content of that social media post. For example, if the client is somebody who's selling a CRM, does the social media post say this person is actually looking to buy a CRM? Maybe their old CRM provider was not good and they want to switch, right? That's a very clear intent signal we can derive from that social media post, whether it's on LinkedIn, X, or so on, and say, "Okay, this is a good lead. It fits the ICP criteria. It fits the decision-making criteria. We should provide it to the client." That's the entire process of how it works from top to bottom.

On top of social signals, we're also looking at PR, company filings, really any data that we can pull up to find the intent, but social media is going to be a great one to show some of those intent signals. One of the other things to build on that is, like, a Series A round or a seed round, right? If a company has recently raised a Series A round, and they've maybe said in that round that they're looking to hire, maybe a recruiting company is a good fit to be connected with that company, because they want to spend a lot of that money on hiring. A recruiting agency may do very well connecting with that company.
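The Series A example above amounts to a simple scoring rule over intent signals. As a toy sketch, with signal names, weights, and the threshold all invented for illustration (the real agent competition models are not public):

```python
# Toy intent scorer for the funding-round example above. Every signal
# name and weight here is a made-up assumption, not LeadPoet's model.
SIGNAL_WEIGHTS = {
    "raised_series_a": 0.5,     # fresh capital to spend
    "mentions_hiring": 0.4,     # explicit hiring intent
    "recent_acquisition": 0.3,  # proven appetite to buy
    "new_partnership_with_competitor": -0.6,  # likely out of market
}

def intent_score(signals: set) -> float:
    """Sum the weights of whichever signals were observed for a lead."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in signals)

def is_high_intent(signals: set, threshold: float = 0.6) -> bool:
    return intent_score(signals) >= threshold

# A company that just raised a Series A and mentioned hiring scores
# 0.5 + 0.4 = 0.9, clearing the threshold for a recruiting firm:
print(is_high_intent({"raised_series_a", "mentions_hiring"}))  # True
```

Negative weights capture the point made later in the interview: a signal like a fresh competitor partnership can disqualify a lead that otherwise looks good on paper.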

There's several examples that we could use to actually signal intent, but social media is a primary example of that.

How are you creating these maps for these different companies of this LinkedIn and this X and this series A, these are all the same company. How do you validate that?

When it comes to LinkedIn, for example, each company is going to have a specific URL pattern that we save in our database. Say there's a specific company, we'll call it Infosys, or for example let's just call it Atlassian, right? It's a company, and it has a specific company LinkedIn with a specific URL around that company. If there's a post on LinkedIn from that company specifically, we can easily attach it to that specific company LinkedIn. This is just on the tech end, because the URL pattern is going to be the same. On the profile end, for the person who actually works there, the URL pattern of the profile is going to be the exact same as the actual profile that person has on LinkedIn.

The URL pattern for the profile is going to match the URL pattern that's used for the post from that person as well. So those are a few ways that we can validate the posts. When it comes to Series A funding rounds, etc., we'll check reputable business sources, such as a Forbes, for example, or a New York Times. A lot of times you'll also find the funding information on the company LinkedIn directly. They'll post about the funding round, right? I think it's pretty natural for a company to post about a new funding round they just had, and that's again verifiable just from the URL pattern of the company LinkedIn. We can map that to the post as well and see if the URL pattern matches.
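The URL-pattern matching described here can be sketched with a couple of regexes. The patterns below are simplified assumptions (real LinkedIn URLs vary), using the Atlassian example from the conversation:

```python
import re

# Minimal sketch of attributing a post to a company by shared URL slug.
# These patterns are illustrative assumptions, not LeadPoet's actual code.
COMPANY_RE = re.compile(r"^https://www\.linkedin\.com/company/(?P<slug>[\w-]+)")
POST_RE = re.compile(r"^https://www\.linkedin\.com/company/(?P<slug>[\w-]+)/posts/")

def post_belongs_to_company(company_url: str, post_url: str) -> bool:
    """A post maps to a company profile if both URLs share the same slug."""
    company = COMPANY_RE.match(company_url)
    post = POST_RE.match(post_url)
    return bool(company and post and company["slug"] == post["slug"])

print(post_belongs_to_company(
    "https://www.linkedin.com/company/atlassian",
    "https://www.linkedin.com/company/atlassian/posts/12345",
))  # True: the slugs match, so the post is attributable to the company
```

The same idea extends to personal profiles: the slug in a member's profile URL would be matched against the slug in their post URLs.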

How would you map a LinkedIn account to an X account?

On their site, most companies have their socials linked. So you can kind of consider that a source of truth, where the X linked on their website is their actual X. Obviously, if a company's hacked and someone takes control of the socials, there is an edge case there. We're validating that all of these are connected, and that forms kind of a company profile: this is the company, this is the industry, this is the size, the headquarters location, this is the X, this is the LinkedIn, just validating all of those different socials.

You have, as you're saying, a source of truth there, because obviously in BitTensor, if you don't have a source of truth, the miners will tell you that they're telling you the truth when they just made up an X account called Atlassian X, and this is definitely Atlassian's X, and it's obviously not the real one.

We've definitely considered that, just to ensure that all the signals these models are surfacing are legitimate. And also, since this is an open competition, we can actually look at the code and see: is it actually pinging Data Universe to get this data? What's the X account coming back? How is it structured? Is it structured in a way to just ask an LLM to produce something saying this is the best lead ever, this guy definitely wants to buy your stuff, when it's fully synthetic and not based on a source of truth?

At the current moment, the way the system works, are you just having miners get as many leads as possible, which are LinkedIn accounts with a location, company, and email, or do they have to fit a specific criteria?

It's a broad spectrum. We accept any lead that is legitimate on all the data fields, but we do have some extra incentive on the kind of ideal leads that we want in our database: a lot of startups, some specific target markets that some of our early users have requested. We've added some extra incentive there. So there is some flexibility, but any lead we will accept and give some reward for. But if you're hitting the target leads that we really want, you'll get extra incentive as a miner.

Miners can get leads however they are able to intelligently build their systems and we accept it if it meets all the criteria.

In terms of double validation, Pranav used the term earlier. He said, we double validate leads. What exactly do you mean by this?

This is a great question, and it's the crux of a lot of issues with some of the other lead sourcing mechanisms out there, like an Apollo or ZoomInfo. What they do is validate maybe the email, and usually not the LinkedIn, though ZoomInfo does validate the LinkedIn as well, before it enters their database. They'll validate it maybe a year ago, two years ago, three years ago, but people's roles change, people's emails change, people's LinkedIn changes, right? Even the actual profile link itself can change.

When you don't double validate a lead, if you validate it only when it enters your database but provide it to a client three months or a year later, that lead could be completely stale. You've only single validated it because you want to save cost. But you've now given out a lead that's going to hurt your client, because they're going to send an email. They're going to construct a massive personalized email for this specific lead, and it's just going to go into a void, right? They're not going to respond, because that email is no longer legit.

Double validation is the crux of lead generation, because the lead that's given to the client needs to be validated at the point it's given, to make sure the client is getting a lead that is fresh. It's not something stale that was validated three months ago and is no longer legit. That's the most important thing we make sure we do.

To build off of what Pranav's saying, our north star in building this subnet is data quality. This is just part of ensuring that the data coming in is good, but also on the way out, because any funnel is only as strong as what you put in at the top. If you put in bad data, even if you refine it and do research, you're not actually going to be reaching out to legitimate emails or people that are in that relevant market. So yeah, that's why we focus so much on the quality of the leads.

How do you determine if a lead is still relevant? What is the process?

That comes down to the qualification agent. We have the database, we have the leads, we know what industry they're in and what role they're in, but the agent will dynamically, based on the user's request, determine whether the leads are relevant to that user, because just an industry match or just a role match may not be good enough. Let's say you're trying to sell a CRM, but a company just announced a partnership with Salesforce. They're not likely to be in the market for a new CRM right now.

It's not just looking at the criteria, it's looking at the intent. If someone just acquired a company last week, they may not be looking to acquire another company in the coming months. So it's about combining the criteria of the lead with the intent signal, which lets us see who may look good on paper but is actually not a relevant contact to reach out to.
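The qualification logic described here, a static criteria match that recent intent signals can override, could be sketched roughly as follows. The field names, request shape, and signal taxonomy are all hypothetical, not LeadPoet's actual schema:

```python
def qualify(lead: dict, request: dict, intent_signals: dict) -> bool:
    """Combine a static criteria match with recent intent signals.

    A lead that matches on paper (industry, role) can still be
    disqualified by a recent event -- e.g. a company that just
    announced a Salesforce partnership is unlikely to be shopping
    for a new CRM. All names here are illustrative.
    """
    # Static criteria: industry and role must match the request.
    if lead["industry"] not in request["industries"]:
        return False
    if lead["role"] not in request["roles"]:
        return False
    # Dynamic intent: any recent signal the request flags as
    # disqualifying knocks the lead out despite the criteria match.
    recent = intent_signals.get(lead["company"], [])
    return not any(s in request.get("disqualifying_signals", []) for s in recent)
```

With the CRM example from the conversation, a software-industry VP of Sales would pass on criteria alone, but an `announced_crm_partnership` signal on their company would disqualify them.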

Just to give a very simple example of that, say a lead comes in two months ago and we check its email. We ping the SMTP servers via an API and check whether the email is legit. Then we check whether the LinkedIn is legit, whether the company LinkedIn is legit, and whether the role that person is listed at is legit, meaning they're currently working in that role, not a prior role, right?

Two months down the line, right before we provide that same lead to the client, the qualification agent needs to check all those details again. Is this email legit? Is this LinkedIn legit? Is the company LinkedIn legit? Is this person actually working in the same role they were working in two months ago? It could have changed. They could have moved roles. They could have moved jobs. That lead may not be relevant anymore, may not even be in the same industry the client is looking for. That's why double validation is the crux, and that's the process of how we go about validating each of those fields.
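The double-validation flow described above amounts to running the same field-by-field checks twice: once at ingestion and once again right before delivery. A minimal sketch, with the individual checkers injected as callables (their signatures are assumptions, not LeadPoet's API):

```python
from typing import Callable, Optional

def validate_lead(lead: dict,
                  check_email: Callable[[str], bool],
                  check_linkedin: Callable[[str], bool],
                  check_role: Callable[[dict], bool]) -> bool:
    """One full validation pass: the email, the person's LinkedIn,
    the company LinkedIn, and the current role all have to check out."""
    return (check_email(lead["email"])
            and check_linkedin(lead["linkedin"])
            and check_linkedin(lead["company_linkedin"])
            and check_role(lead))

def deliver(lead: dict, **checkers) -> Optional[dict]:
    """Delivery-time gate ("double validation"): re-run the exact same
    checks that ran at ingestion, since any field may have gone stale
    while the lead sat in the database. Stale leads are withheld."""
    return lead if validate_lead(lead, **checkers) else None
```

The point of the design is that delivery reuses the ingestion-time validator unchanged, so a lead can never reach a client on the strength of a months-old check alone.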

Have you guys run into any issues with the API for checking whether an email is valid, where you send the email and it still bounces because of some spam filter?

No. The way these APIs work is they test exactly that, to ensure the email is valid and accepting mail, and it's a fairly commoditized service. We haven't run into any issues with that.

One more point to add to that: a lot of other incumbents will allow both catch-all and email-okay addresses. A lot of the bounces can come from catch-all emails, because a catch-all means all of the emails sent to a specific domain get aggregated into one inbox, right? Those emails can lead to a higher bounce rate than an email-okay, or even a higher spam rate, with messages going directly into spam for that domain.

What we do is we do not allow any catch-alls into the database. We only allow email-okays, that is, validated addresses. That in and of itself ensures much higher data quality for the emails you're sending, versus other incumbents, which allow both catch-alls and email-okays.
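The catch-all distinction described here is typically detected with two SMTP-level probes: one RCPT TO for the target address and one for a random address at the same domain. If the random address is also accepted, the server accepts everything, so acceptance proves nothing about the specific mailbox. A sketch of the classification step (the probe results would come from a validation API; this function is illustrative, not LeadPoet's code):

```python
def classify_mailbox(target_accepted: bool, random_accepted: bool) -> str:
    """Classify an address from two RCPT TO probe results.

    - "invalid":    server rejected the target address (hard bounce likely)
    - "catch_all":  server also accepted a random address, so acceptance
                    proves nothing about this specific mailbox
    - "email_okay": server vouched for this mailbox but not for a random
                    one -- the only class admitted into the database
    """
    if not target_accepted:
        return "invalid"
    if random_accepted:
        return "catch_all"
    return "email_okay"
```

Rejecting the `catch_all` class is what keeps aggregated domain inboxes, the main source of bounces and spam placement, out of the lead pool.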

A recent announcement you had was that you are now part of the Nvidia Inception program. What does this mean for you? How does it change the trajectory of LeadPoet?

We were quite excited to see that we were accepted to the program. What it really means is they provide AI startups specifically with a bunch of resources, including compute and discounted rates for other services. It also provides a pipeline for our product to others in the program, who can access it directly through Nvidia's platform. The third aspect, which is also very important, is access to an investor network that can tap directly into the AI startups in the program.

Does this program require you to give up some portion of equity, or some investment from them?

No. It's more of just a resource program, and obviously it gets us in the pipeline to more easily access Nvidia services, easier access to Nvidia GPUs and other products they have as well. Luckily it doesn't require equity or Alpha or anything we need to give them.

To be quite honest, the investor network and the overall ecosystem, the access to other founders within the AI space, are phenomenal in and of themselves. Since we're an AI startup, we run a lot on AWS and carry the costs associated with that. Having this partnership definitely helps tame those costs for us as a business, so we can put that money into other things such as hiring, marketing, and selling the product, all important things we want to do with this business, and we want to bring that revenue back into the BitTensor ecosystem. It's a phenomenal partnership for that, because it's allowed us to save a lot on our tech costs and allocate that money into things we know are very important.

On the note of partnerships, we lost Pranav, at least it was right after he finished his section, so that was perfect timing. Partnerships.

I went through your X today, and these were all the partnerships I saw. If I missed any, or if any aren't valid anymore, let me know: Bitsack, Metanova, Chutes, Video, Giannis, Reszi, Bitrex, Liam, Zeus, Score. I think you guys now take the title for most partnerships; if I missed some, fill me in. What do these partnerships entail?

I think we're still second to Chutes. Someone tagged me in a post, and there's actually a leaderboard of partnerships, and we were second. We're definitely getting close. What it means is it's been a real great testing round for our data quality, ahead of now having some paid users who are actually paying, getting leads, and giving us additional feedback.

It was a great training round to give out initial leads and get more ideas from the users about what they'd like to see in the leads and what different fields, and it's been great to help other subnets in the ecosystem with their sales. But it doesn't really end there. It's just been an initial training ground with them, and they'll be getting priority access to our products, including the qualification product.

As we approach the end of the first quarter, we're also going to be doing some automated outreach, and we want to provide that to these users first, really helping BitTensor as a whole go to market through our specific partners. But any real subnet, we're glad to speak with them and help them however we can, because, back to the term wizards, there are so many people in BitTensor who are very technically skilled but may just not have dealt with go-to-market or sales before, or may have only been ancillary to it.

This will really automate a lot of that painful process, which takes a lot of training, knowing how to qualify a lead and how to do outreach that's actually effective, and it just speeds it up for them.
