
Author: Philip Sentner
Date: October 2023
Quick Insight: The crypto world is a fragmented mess. LeiFi builds the meta-aggregation layer that makes multi-chain transactions, especially for Real World Assets (RWAs), feel like a single click.
The crypto world is a fragmented mess of chains, assets, and liquidity. This creates a headache for users and institutions. Philip Sentner, LeiFi's CEO, explains how his company builds the essential infrastructure to make this multi-chain world fluid, especially as Real World Assets (RWAs) come online.
"I see tokenization as the successor to digitization."
"Interoperability is the default today. It's given, it's necessary to even start any company in the space."
"The moats in times of AI are brand, distribution, data... and network effects."

I see tokenization as the successor to digitization. Once we see a vehicle being represented on chain, or real estate, or a mortgage contract, there's a whole new future of financial products coming.
This is amazing; blockchain has an amazing future ahead. This is just the beginning, right? And yeah, we simply placed a big bet on this.
I have with me Philip Sentner, the founder and CEO at LeiFi.
I want to understand which assets are available on which chain, and at which price these assets are available. On top of that, I need to understand, if I want to get from A to B, what the potential routes are. You might have to swap an asset so that you can bridge and then swap again.
So this is complex, and it implies the user would have to sign three transactions. That's a manual effort, and you definitely don't want a failed $10 million transaction.
So instead, LeiFi built a multi-chain transaction rail. This rail is a set of smart contracts that lives on all these chains and allows us to combine a variety of transactions into one.
Philip, you've been recently talking about **RWAs** and how that impacts interop. I would love to know what your thesis is.
I don't believe that RWAs impact interop per se. It's more that interop is the default today. It's a given; it's necessary to even start any company in the space. You need multi-chain connectivity, if only so that your users have an easy way to find you. It's about the user's journey.
And then with others, of course, we also see a lot of minting of multi-chain-native assets and token standards like OFTs, NTTs, or CCTP. That's very interesting to see, because it shows that the modus operandi, the default paradigm, is already multi-chain for these people, and that's obviously great to see.
We see assets being minted on so many chains, from Avalanche to Solana to Sui, but also companies building their own chains and issuing assets there, like Robinhood Chain for example. So it is as multi-chain as we foresaw years ago, and for LeiFi that's obviously great.
In terms of the subcategories of RWAs, treasuries, private credit, and so on, what is the most interesting to you right now?
I think there are multiple observations. We see a lot of assets being tokenized, but that alone is not enough, right? They need to be liquid, they need to find distribution, and ultimately they need to find demand. Distribution and demand is a very typical chicken-and-egg problem, right?
So why would a MetaMask take the effort to specifically index RWAs if they don't know which is more important: doing that, or making sure they support MegaETH on day one?
So there are, as usual, conflicting priorities, and I believe that institutions are still figuring out what distribution even means. Does it mean being tradable 24/7 with lower fees inside Robinhood and Revolut? Or does it mean getting the DeFi-native crowd to stop trading memecoins and start trading real stocks, but on chain?
I think there is little right or wrong here right now. For me, we are very, very early in the RWA phase, and I'm happy to see institutions getting their hands and feet wet and trying things out. But it's still pretty much a trial period for many of them. There is still a huge lack of domain-specific and industry-specific knowledge within these companies, and we will see how this plays out.
It's funny. I was just discussing this with my team: it feels like the past four or five years have been the DeFi era, the DeFi-and-memecoin, figuring-itself-out era, and now that feels like it's ending. The space is consolidating heavily. People don't believe in protocols that much anymore. You can see this reflected in early-stage funding, which has pretty much dried up at the moment.
At the same time we see the big players coming in, but they are obviously very early. So what I'm foreseeing is a kind of lull in which volumes stay potentially very isolated within certain environments.
I'll give you an idea. Let's say JP Morgan issues a trillion dollars in tokenized assets on the Canton Network or on their own private chain. Then we might see a lot of crypto volume turn over between their clients on that one chain, but nothing of existing crypto really profits from it.
I mean, the outside image is there, and JP Morgan's savings on all these things, and the new profits and new business models, might shine a good light on the industry. But everyone else in this industry right now doesn't directly profit from it; indirectly, maybe. Crypto right now feels weird to me. What I'm saying is, it's going to be a longer transitionary phase.
Now I've moved far away from the question of what RWAs mean for interop and made it more about what RWAs mean for DeFi and crypto.
Quick note before we continue. About 96% of the people watching this aren't subscribed. You can see it right here. If this show has been useful, subscribing makes a real difference. It tells YouTube this is worth showing to more people, and it directly helps us bring on the founders and operators you want to hear from. So, please subscribe to the show.
For the user, can you give them a peek into what kind of complexities you have to deal with so that a swap is done in one click?
For LeiFi the story is very simple. We are looking at exponential fragmentation across blockchains, multi-chain token standards, native token standards, interoperability solutions, and exchanges. And now with RWAs it's fragmenting even more.
So we spent five years building that orchestration layer, or coordination layer, on top, which allows you to go in and out of any position: very flexible, very modular, chain-agnostic and solution-agnostic, and at the same time scalable, with enterprise-grade reliability.
We have really been focusing on that over the last year, because in the years before, to be frank, we had so many scaling issues. It was tough because we were growing way faster than we could handle. Now we have solved these issues and we have the enterprise-grade experience that a Robinhood expects.
From here, what have we really built? It starts with aggregation: aggregation of APIs, and therefore of data. I want to understand which assets are available on which chain, and at which price these assets are available.
On top of that, I need to understand, if I want to get from A to B, what the potential routes are. So, smart order routing. All of that is still at the data level, right?
Then you ultimately end up with scenarios in which you have to swap an asset so that you can bridge and then swap again. So this is complex, and it implies the user would have to sign three transactions.
So instead of having to do that, LeiFi built a multi-chain transaction rail. This rail is a set of smart contracts that lives on all these chains and allows us to combine a variety of transactions into one. We can swap-bridge-swap with one transaction.
Instead of sending a transaction to a bridge directly, it goes to our smart contract; the smart contract sends it to the bridge; the bridge sends it back to our smart contract; and then it gets back to the user. These kinds of things. Plus the ability to send meta-information alongside, unpack it, make sense of it, and do all of that as gas-efficiently as possible. That's what has been asked of us over the past few years.
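To make the swap-bridge-swap idea concrete, here is a rough TypeScript sketch of packing a three-step journey into a single payload that a contract could unpack step by step. Every name here is illustrative; this is not LeiFi's actual contract interface or SDK.

```typescript
// Hypothetical sketch: a multi-step cross-chain journey encoded into one
// payload, so the user signs once instead of three times.
type Step =
  | { kind: "swap"; chain: string; fromToken: string; toToken: string }
  | { kind: "bridge"; fromChain: string; toChain: string; token: string };

interface Journey {
  steps: Step[];
  receiver: string;
}

// Pack the whole journey plus meta-information into one payload that the
// on-chain rail can unpack and execute step by step.
function encodeJourney(j: Journey): string {
  const meta = { receiver: j.receiver, n: j.steps.length };
  return JSON.stringify({ meta, steps: j.steps });
}

const journey: Journey = {
  receiver: "0xUser",
  steps: [
    { kind: "swap", chain: "ethereum", fromToken: "USDC", toToken: "ETH" },
    { kind: "bridge", fromChain: "ethereum", toChain: "arbitrum", token: "ETH" },
    { kind: "swap", chain: "arbitrum", fromToken: "ETH", toToken: "USDC" },
  ],
};

const payload = encodeJourney(journey);
```

The point is only structural: one signed transaction carries the whole encoded journey, and contracts on each chain unpack the meta-information as the transfer hops along.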
Now we are doing two new things. The first one: well, we have been able to swap-bridge-swap, and we have also been able to leverage the idea of calling any smart contract at the end of this complex transaction.
But here it already becomes difficult, right? The idea of calling any smart contract at the end of such a transaction, maybe an Aave deposit or a Morpho deposit, or maybe Pendle, whatever, is obviously hard to scale if you think about it. All these protocols have different smart contracts, so it's really hard to abstract that.
We offered it anyway, and with many enterprise clients we went down the road of writing custom smart contract integrations for all of these things. It was just not that scalable. So over the past two years we have developed a virtual machine, and that virtual machine expresses a domain-specific language in TypeScript.
You can essentially type out whole user journeys: check my wallet balance; if it's bigger than $1,000, take $1,000, swap it into X, then deposit it as collateral into Aave, take out a loan, as much as possible or just 75% of that, and then swap that further into something else.
These kinds of multi-chain strategies can now be expressed in TypeScript, and we get executable call data out of that virtual machine. Not only that, it has its own transaction simulator, which is super fast and cheap.
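The journey described above might look roughly like this in TypeScript. This is a hypothetical sketch of the idea only, not LeiFi's actual DSL; `Wallet`, `buildJourney`, and the action strings are all made up for illustration.

```typescript
// Hypothetical sketch of a "user journey" expressed in TypeScript:
// check balance, and if it is big enough, swap, deposit, borrow, swap again.
interface Wallet {
  balanceUsd(token: string): number;
}

interface Plan {
  actions: string[];
}

function buildJourney(wallet: Wallet): Plan {
  const actions: string[] = [];
  const balance = wallet.balanceUsd("USDC");
  if (balance >= 1000) {
    actions.push("swap 1000 USDC -> wstETH");
    actions.push("deposit wstETH as collateral on Aave");
    const loan = 1000 * 0.75; // borrow 75% of the deposited value
    actions.push(`borrow ${loan} USD`);
    actions.push("swap borrowed funds -> target asset on destination chain");
  }
  return { actions };
}

// A wallet stub with a $2,500 USDC balance triggers the full journey.
const plan = buildJourney({ balanceUsd: () => 2500 });
```

In the system described in the interview, a plan like this would then be compiled down to executable call data and run through the transaction simulator before anything is signed.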
So we actually have call data we can rely on, and this allows us to make multi-chain journeys much more scalable. It is very unique. We have put a lot of R&D into all of this, and we are rolling it out with big clients like Alipay now, which is great.
That's one thing. The other thing we have been doing: instead of just aggregating, because aggregation is also hard to scale, we are now pairing this with an open solver marketplace.
It's essentially an intent system: it takes an intent and then auctions it across a network of solvers. I don't really believe the market is ripe for very decentralized, massive solver systems yet, because there are not enough solvers, it's hardly profitable to run them, and the space is more fragmented than solvers could cater to in an economic way.
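The auction mechanics described here can be sketched in a few lines of TypeScript. This is a toy model with invented names; real solver auctions also handle deadlines, bonding, and settlement proofs.

```typescript
// Toy sketch of auctioning an intent across a network of solvers:
// each solver quotes an output amount, and the best quote wins.
interface Intent {
  fromChain: string;
  toChain: string;
  token: string;
  amount: number;
}

type Solver = { name: string; quote: (i: Intent) => number | null };

function auction(
  intent: Intent,
  solvers: Solver[],
): { winner: string; out: number } | null {
  let best: { winner: string; out: number } | null = null;
  for (const s of solvers) {
    const out = s.quote(intent); // null means the solver declined or is offline
    if (out !== null && (best === null || out > best.out)) {
      best = { winner: s.name, out };
    }
  }
  return best;
}

const result = auction(
  { fromChain: "ethereum", toChain: "arbitrum", token: "USDC", amount: 1000 },
  [
    { name: "solverA", quote: () => 995 },
    { name: "solverB", quote: () => null }, // offline solver is simply skipped
    { name: "solverC", quote: () => 997 },
  ],
);
```

This also illustrates the point about market ripeness: with only a handful of solvers bidding, the "auction" degenerates to picking among very few quotes, which is why aggregation remains the backbone.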
So aggregation is still the key solution for everything, but this allows us to plug in third-party liquidity. It allows all these people who tokenize assets to bring their own liquidity and market-make it themselves, while we have all the distribution power.
So in a sense we are the best solution for anyone who tokenizes assets to actually find distribution and find their way into wallets of any kind, from a Ledger to a Trezor to a Robinhood to a MetaMask to a Phantom.
Again, LeiFi is really the middleman that brings these two sides together, and that can actually accelerate the adoption of tokenized assets. That's something we are super bullish on and heavily investing in as a company right now.
Were there market conditions that demanded this kind of complexity in the product? Or is it something that you foresaw, where you decided the market is headed in that direction, so why not build for it right now rather than when the demand arrives?
We can start with the latter: for LeiFi it was just clear from day one. I joined the space in February or March of '21, and I had never touched crypto before; I didn't even have a centralized exchange account, so I was late to the party. But I came into this with a very technical perspective. Actually, the first blockchain I ever engaged with, with a Trezor T, was, what was it called again...
They were sponsoring Formula 1 and everything. Not Tron, not TON. Tezos, exactly. So the first chain I engaged with was Tezos, and I was actually excited about them because they were supporting a variety of smart contract languages. You could write your contracts in Pascal, in TypeScript, in Python.
So that's great. Imagine if the EVM supported writing smart contracts in a variety of languages like that; it would kickstart adoption even more.
Why does it matter and doesn't that increase complexity?
It certainly increases complexity for others, for the chain and so on. But if you can write smart contracts in any language, with certain frameworks and maybe restrictions, and it compiles into smart contract code, it obviously makes adoption much easier. The Rust developer can write his smart contract as easily as the Python developer, as easily as the web developer who comes from TypeScript. It would just allow faster adoption of blockchain if there were more languages.
It has its downsides, obviously, because you need to maintain these frameworks, compilation can be tricky, and languages are complex and still evolving; languages are, in a sense, frameworks themselves, right?
Anyway, that's what excited me about Tezos back then, and then I quickly moved on to EVM, understanding that that's where things were actually happening. But I came from a strong database background, and looking at blockchains it was very clear: hey, blockchains are just databases.
We sometimes call them execution environments, but a database is an execution environment too. I'll give you an example. In adtech, when you go on a website, within a split second they have to decide which advertisement to show you based on your past behavior and preferences.
So it's you as an identifier, with lots of relational data about your past behavior, and all of that bidding has to happen in that instant. These big tech companies often use PostgreSQL, and they write the code within the database itself, as stored procedures; everything runs within the database. The database is the execution environment, just like a smart contract runs on the blockchain.
However, we looked at blockchains and thought: it has to be multi-chain. There was no other way it would scale; it will not all converge on one chain. And we're not even at the end of it. As more value chains move on chain, you're going to see blockchains for different data types: spatial data, graph data, or vector data, whatever.
And then of course there's also a strong economic incentive to have your own chain, for control, compliance, or simply a greater degree of value accrual. We see this with Stripe and Tempo, Circle and Arc, Robinhood and Robinhood Chain. This is just the beginning, right?
So, very interesting, and yeah, for us it was very clear from day one. The signs were also clear that we were not the only ones, because the second hackathon we did was around interoperability, an ETH Global hackathon on interop, where we met the Connext team, the Hop team, Optimism, Arbitrum. The signs were there from day one for us, and we simply placed a big bet on this. Which was funny, because even a few months later, when we raised our seed round, many VCs were not seeing it. But compared to the internet, fragmentation in crypto will remain real, and the dynamics are very different as well: in TradFi you don't have the economies of scale that you have in crypto.
In crypto, when it comes to order flow for example, the degree of economies of scale is much higher. Why? Simply because settling transactions is more expensive: there is a technical cost to it, blockchains have gas, which costs money, and fragmentation also comes with rent-seeking at different steps.
So if you can batch transactions, and if you have more data on where things are flowing, capital efficiency becomes a whole different ballgame. The more data and the more order flow you have, and I separate these two things: data, for example quoting data, gives you an indicator of where money is flowing next, while transactions are order flow. The combination of the two is powerful, and that's something LeiFi is optimizing for. We optimize for as much data and as many transactions as possible, and later on that's really where we can provide benefits to market makers, or even market-make ourselves if we wanted to go down that route.
But yeah, similar to what Robinhood is doing with Citadel, I think we have a very similar positioning down the line.
When you are purely aggregating versus when you are trying to unify liquidity, how does one view that? I mean, what is the mental model to understand the difference between a pure aggregator and what LeiFi is right now?
There is no pure aggregator. There are different kinds of scenarios in which aggregation makes sense. I'll give an example: what everyone knows is a DEX aggregator. That's what we grew up with in crypto, and the DEX aggregator makes sense.
You have a blockchain, and you have multiple DEXes and AMMs. AMMs are based on liquidity pools, and the price function of a liquidity pool reflects the supply and demand of that pool. But if I have multiple pools for the same asset pair, it makes sense to split a transfer across these pools to avoid too much price impact, right?
The less you move each pool you tap into, the less price impact there is, and then you can even combine this with additional data you have around liquidity flows and optimize further. This is what DEX aggregators are doing. DEX aggregation, to simplify, splits trades across homogeneous liquidity pools. That's one problem, and it's been solved by the 1inch and 0x of this world.
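A quick numeric sketch of why splitting helps, using the constant-product (x·y=k) formula and two identical hypothetical pools, with fees ignored for simplicity:

```typescript
// Constant-product AMM output for a given input, ignoring fees:
// out = amountIn * reserveOut / (reserveIn + amountIn)
function ammOut(amountIn: number, reserveIn: number, reserveOut: number): number {
  return (amountIn * reserveOut) / (reserveIn + amountIn);
}

// Two identical hypothetical pools for the same asset pair.
const pools = [
  { reserveIn: 1_000_000, reserveOut: 1_000_000 },
  { reserveIn: 1_000_000, reserveOut: 1_000_000 },
];

// Route the whole 100k through one pool...
const single = ammOut(100_000, pools[0].reserveIn, pools[0].reserveOut);

// ...versus splitting it evenly across both pools.
const split =
  ammOut(50_000, pools[0].reserveIn, pools[0].reserveOut) +
  ammOut(50_000, pools[1].reserveIn, pools[1].reserveOut);

// split > single: each pool is moved less, so total price impact is smaller.
```

With these numbers the single-pool route returns roughly 90,909 while the split route returns roughly 95,238, which is exactly the price-impact argument made above.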
Now there is bridge aggregation, which solves a different problem. Imagine you have a bridge. A bridge supports different routes, different chains, but then per chain-to-chain route often also different assets. On Polygon you have a different native token than on Arbitrum.
So obviously there is MATIC as a token, and on Arbitrum you need ETH; and between Solana and Arbitrum it's SOL and ETH. So there are different assets for different routes, and so on. And then Tempo from Stripe will support stablecoins as the underlying gas token. So bridging is complex. The moment you want to aggregate two bridges, you have to map the different chains.
These chains are expressed differently by different bridges. Stargate, for example, uses shortcuts: they don't use chain IDs, they use something like SOL for Solana or ETH for Ethereum. Someone else, like Across, might use the EVM-native chain IDs. And someone else who thinks a bit bigger, outside the Ethereum context, might use the chain-agnostic improvement proposal (CAIP) chain identifiers. So this needs to be mapped, and that's a manual effort. On top of that come the different assets. This is about route coverage, routing correctly, routing smartly. And then you might want to leverage data telemetry and say: we have these different routes from A to B, but they have different bridging times, different costs, and different security assumptions.
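A minimal sketch of that mapping effort in TypeScript. The bridge names and identifier values below are illustrative, not the actual tables of Stargate or Across:

```typescript
// Normalize per-bridge chain identifiers to one canonical chain name.
// Each bridge expresses the same chain differently: ticker-style shortcuts,
// raw EVM chain IDs, or CAIP-style identifiers.
type CanonicalChain = "ethereum" | "arbitrum" | "solana";

const bridgeChainMap: Record<string, Record<string, CanonicalChain>> = {
  // Hypothetical bridge using ticker shortcuts
  tickerStyle: { ETH: "ethereum", ARB: "arbitrum", SOL: "solana" },
  // Hypothetical bridge using EVM-native chain IDs
  evmIdStyle: { "1": "ethereum", "42161": "arbitrum" },
  // Hypothetical bridge using CAIP-style identifiers (values illustrative)
  caipStyle: {
    "eip155:1": "ethereum",
    "eip155:42161": "arbitrum",
    "solana:mainnet": "solana",
  },
};

function normalize(bridge: string, chainId: string): CanonicalChain | undefined {
  return bridgeChainMap[bridge]?.[chainId];
}
```

Once everything resolves to the same canonical name, routes from different bridges can finally be compared side by side.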
So you might say: actually, for this very large transfer we would prefer the mint-and-burn route. It's slower, but it's safer, and we definitely don't want a failed $10 million transaction. Right, so we go for mint-and-burn, these kinds of things. Again, this is really about smart routing in a more qualitative sense, less in a quantitative sense like DEX aggregation.
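That qualitative routing decision can be sketched like this; the route types, the threshold, and the crude score are assumptions for illustration only:

```typescript
// Sketch of qualitative route selection: for large transfers, prefer a
// slower mint-and-burn route over a faster liquidity-pool route.
interface Route {
  bridge: string;
  type: "mint-and-burn" | "liquidity-pool";
  etaSec: number;
  feeUsd: number;
}

function pickRoute(routes: Route[], amountUsd: number): Route {
  const large = amountUsd >= 1_000_000; // illustrative safety threshold
  const ranked = [...routes].sort((a, b) => {
    if (large && a.type !== b.type) {
      // Safety first for large transfers: mint-and-burn wins outright.
      return a.type === "mint-and-burn" ? -1 : 1;
    }
    // Otherwise a crude combined cost/speed score (units mixed on purpose,
    // just to break ties in this toy example).
    return a.feeUsd + a.etaSec - (b.feeUsd + b.etaSec);
  });
  return ranked[0];
}

const routes: Route[] = [
  { bridge: "fastLP", type: "liquidity-pool", etaSec: 30, feeUsd: 20 },
  { bridge: "canonical", type: "mint-and-burn", etaSec: 900, feeUsd: 5 },
];

const forTenMillion = pickRoute(routes, 10_000_000); // safety wins
const forSmall = pickRoute(routes, 1_000); // speed and cost win
```

A real router would of course score security assumptions, liquidity depth, and failure telemetry rather than a single threshold, but the shape of the decision is the same.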
And we are not operating with homogeneous liquidity pools but with heterogeneous APIs. It's a totally different game; bridge aggregation has nothing to do with DEX aggregation. Now, LeiFi is a meta-level aggregator, if you want. We sit above all of that: we do both and combine them.
And we have our own bridging system as well, and our own DEX aggregator, to cover gaps in the market. We want to ensure transaction delivery. We want to make sure we can cater to all our clients' demands, and very often these clients are new chains, like MegaETH right now, or enterprises like OKX. OKX is a centralized exchange, and centralized exchanges over the past years have all been following the same strategy: have your own chain, have your own non-custodial wallet.
So there's no demand-side incentive for any of the DEX aggregators to support OKX. Historically we have been knocking on these doors; now we can simply deploy our own DEX aggregator, and that's great. It just gives us less dependence on these parties, but we're going to remain market-neutral, always.
So if a new DEX aggregator launches on that OKX chain, we're going to support that too. You can see that LeiFi is just positioned very differently: we understand everything down below, and we aggregate everything down below.
So would it be fair to say that it's a combination of 1inch and deBridge?
Yes, in a sense. We have an intent system like deBridge for the bridging side of things. At the same time we have our own DEX aggregator, like 1inch, but on top of that we also aggregate all the 1inches that are out there. Right. Exactly.
Because these DEX aggregators perform very differently. DEX aggregators compete on smart contract efficiency, on algorithmic efficiency, and on the number of liquidity pools they tap into. And there is a weird trade-off they have to make: the more liquidity pools you tap into, the slower your internal reasoning becomes.
For example, Odos has been incredibly good at generating very competitive quotes, but it takes them two, two and a half seconds to get there. That's just very slow for a Binance that wants centralized-exchange-like connectivity. That being said, aggregation of DEX aggregators makes sense because they support different chains and compete on the number of liquidity sources they tap into.
On different chains there are different liquidity sources to tap into, so it's really a race on each single chain to be competitive as a DEX aggregator. So having full access to everything matters, and our data shows a very even distribution across all the DEX aggregators we support. They all claim to give the best quotes. They all claim this, which is funny.
But that's not the truth. They perform differently well, and it also depends on the trades: some optimize for larger trades, some plug in PMMs (private market makers) to cater to those larger trades. So they're all different, and aggregating them makes sense. And then, from the enterprise perspective, LeiFi is also used because you want redundancy: if one of these providers goes down, you want a fallback, for each scenario, for each route, for each chain. That's why LeiFi has over 800 B2B clients today.
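The redundancy pattern described here, query several aggregators, tolerate outages, take the best successful quote, can be sketched like this (the provider names are made up):

```typescript
// Sketch of aggregator-of-aggregators redundancy: ask every provider,
// skip the ones that fail, and return the best surviving quote.
type Provider = { name: string; quote: (amountIn: number) => number | null };

function bestQuote(
  providers: Provider[],
  amountIn: number,
): { provider: string; amountOut: number } {
  const quotes = providers
    .map((p) => ({ provider: p.name, amountOut: p.quote(amountIn) }))
    .filter(
      (q): q is { provider: string; amountOut: number } => q.amountOut !== null,
    );
  if (quotes.length === 0) throw new Error("all providers failed");
  // Take the highest output; a real router would also weigh gas and latency.
  return quotes.reduce((a, b) => (b.amountOut > a.amountOut ? b : a));
}

const result = bestQuote(
  [
    { name: "aggA", quote: () => 99.1 },
    { name: "aggB", quote: () => null }, // provider down: fall back to the rest
    { name: "aggC", quote: () => 99.4 },
  ],
  100,
);
```

The enterprise value is in the `filter`: a single provider outage degrades the quote slightly instead of failing the swap.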
How do their demands on LeiFi differ? Can you take a few types of clients and tell us which things are more important to type-A clients versus type-B?
An institutional trading desk, for example, maybe has much more demand for GUSD and PYUSD than a normal wallet has, while a normal wallet has to cater to crypto retail, so they're much more keen, or have the need, to be on that next new hyped L2.
Last year we were looking at Berachain, Sonic, Stable, Plasma, and so on, and these days all eyes are on MegaETH. So for them it's a race, and therefore for us it's a race to always be there on day one. We've been getting better at that. Historically we were dependent on third-party bridges, so they had to deploy first, and they often only made it after day one, so we had to catch up on that first-day volume, which was not optimal for us.
Now we can land on chains on day one ourselves, which is great: less dependence on third parties for expansion efforts. And yeah, other demands differ too. Some wallets want to support tokens that take a fee while swapping; others forbid them. Some need better screening, AML-related screening like Chainalysis or Elliptic; others need more or less of that. So demands definitely differ. Then, when it comes to non-EVM, you know, that one wallet just got incentivized to launch on TON or Tron, and since we are their swapping and bridging partner, we get pushed to go there faster than we otherwise intended to. We typically try to gather as much customer demand for something as possible, but of course we also don't want to lose our major clients.
Last year was crazy. Demand for Sui, demand for Tron, demand for so many other things. It was really intense last year.
Do you still maintain all of that? I mean, what I'm trying to understand is what kind of effort it is, at an infrastructure level, to add these things and to keep maintaining them.
Maintenance is a huge effort. In the past we saw so many things breaking that we really doubled down on observability. We do shadow transacting and shadow quoting, and we have Grafana dashboards for all of our implemented solutions. Everything is very well monitored these days, and that allows us to proactively deactivate certain chains and inform partners.
Hey, this isn't available anymore, because of a lack of liquidity for example, or because no one is offering a route anymore; there's a native bridge, but we didn't see enough customer demand on your end, so maybe we deprecate this. So it's a constant dialogue with integration partners and with customers, and looking at general market data. We have a huge data team; we understand the market very, very well in terms of where liquidity is flowing, by whom, and why.
So yeah, we have become quite good at foreseeing these things. But yes, everything keeps updating, so there is constantly something to fix, and new edge cases keep coming up. We are 120 people now, and we're going to be 200 people by the end of the year. And with RWAs it's not going to get easier, right? There is so much more coming, but we are perfectly positioned for this fragmented future, being core infrastructure for a very fragmented market.
How do you think about growth? Would you say it is a lull phase or a bear phase, whatever you want to call it? How do you think about growth in that time?
Honestly, I've reached a point where I'm ignoring these market conditions. I mean, I care from a metrics standpoint, but LeiFi keeps on growing no matter if it's bear or bull, and that's great. Of course we also grow like this, right? Wait, I need you to put this on stream: it goes like this (gestures a jagged line), but it keeps going, you know, it keeps going up.
The problem is growing, our solution is growing, the moat is growing. It's really hard now to catch up to us, even with AI, because we already use so much data telemetry to optimize things; that's just tough. And it takes a product in the market to understand what it is you actually need to build. You can probably use AI to plug a few APIs together, but once you take it to market you're like: oh damn, I need this too. And sure, you might be able to catch up on these things, but you need code you really understand, because there are so many edge cases to handle, and I think with AI it would still be very messy to do that. At the same time, we have 70 developers and we make heavy use of AI ourselves; AI is one of our major topics internally. We are not lagging behind in leveraging AI, so we are just getting faster too. Competition can build something fast now, but we are also getting faster: building this out, hardening our moat, winning more deals, expanding, being more aggressive.
I'm curious how that use has changed over the past year or so. From a product standpoint, can you really trust it at this point, or is it more about making sure your productivity gets a boost?
Well, it's nice for smoke tests. You can build prototypes within 24 hours, take them to a client, ask for demand, and then ask the AI to adjust it to our existing internal interfaces. In combination with a large enough context, you can do a lot; I think it's really worth it by now. I was discussing this with a friend yesterday: in times of AI it is even more important to have a microservice architecture. We had the microservices hype for a long time, then people went back to monoliths. We also started building LeiFi as a monolith, which keeps you way more efficient early on, and we turned back to microservices as we had to scale. Now, with AI, it's really worth it to have microservices, because you have isolated code, so you can reduce the overall scope and context the AI has to operate with in terms of understanding certain interfaces and so on.
And that's just a design choice that really matters nowadays. It allows us as a company to play around, test a lot, and go to market faster with certain things.
Was it a very conscious decision to go after businesses as clients? I mean, 800 clients don't happen overnight, right? There must have been a strategic decision that this is where you wanted to go.
I've never seen DeFi B2C as sustainable, except for the very few money markets, like an Aave, that have something durable. And even Aave, for me, is more of a B2B play today. If you think about it, it just makes sense. So for me, from day one, it was clear: I only ever believed in B2B here.
The moats in times of AI are brand, distribution, and data. If you leverage historic data for data telemetry, data you had to accumulate, that is a moat. And then there are network effects, if you have them. So those are four things, and LeiFi has all of them. We have a strong brand within the space; now we need to tackle TradFi and become more known in that area. We have strong distribution. We leverage our data, which potential competitors will only learn about as they go to market and wonder, if they benchmarked us against themselves, why we perform better: it's because we make heavy use of data telemetry. And the fourth is network effects, and we have those in multiple ways.
Intents have been pretty big in this space, but they're not everything, right? I think you've said that intents don't necessarily solve interop on their own, and that there is a lot that goes on beyond just enabling intents. Can you unpack that? What do intents allow us to do?
Intents. In general, the intent system we have is a special one. It supports multiple transaction flows. You might have heard of resource locks, but Across also came to market with ERC-7683, and then there are different escrow mechanisms. LeiFi's intent system supports all of them, any flavor, which is great. So whatever you want to put an emphasis on as a business, you can use it. The same applies further down the line on the verification side: what are we using to verify these things? You can go with Wormhole, you can go with Polymer, you can go with LayerZero. We support all of these.
And that's beneficial too, because again it allows us to cover many use cases in different contexts with different requirements on security, speed, or cost, whatever is important. We also support a variety of transaction domains. What I mean by that is same-chain swaps and bridging, and we can easily add more, like perp or lending intents. You're really flexible across the stack, and that makes our system very special; none of the others have that. The others would be an Across, a deBridge, a UniswapX, or a 1inch Fusion.

We waited a long time to address the topic of intents, and that was one of my hardest choices back then. I saw the intent narrative coming up and thought, hold on, what's happening? From a theoretical standpoint it was a real threat to LeiFi. But with everything in life, and especially as an entrepreneur, timing matters: when do you do the right thing? We waited patiently. Then there was another narrative, chain abstraction, which is technically what we have been doing since day one, but again something we waited out. We did not overreact, did not pivot into account abstraction, did not pivot towards gas abstraction. And then we had the chance to buy Catalyst. I had invested in Catalyst myself a few years earlier as a business angel, and we eventually bought the company, which was a great choice. Alexander and Jim are incredibly smart and driven and have built, in my opinion, the best system in this market when it comes to intents and intent execution. So I'm very happy about that. However, LeiFi remains solution-agnostic, market-agnostic, and market-neutral: any other intent system is also implemented or will be implemented, and it's going to be a fair auction across everything.
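The pluggable design Philip describes, with multiple settlement flows, interchangeable verifiers, and extensible transaction domains behind one router, can be sketched roughly as follows. All type and function names here are hypothetical illustrations, not LeiFi's actual API:

```typescript
// Hypothetical sketch of a solution-agnostic intent router.
// Settlement flows and verifiers are plain data, so new ones
// (e.g. a lending domain) can be added without changing the router.

type SettlementFlow = "resource-lock" | "erc7683" | "escrow";
type Verifier = "wormhole" | "polymer" | "layerzero";
type Domain = "same-chain-swap" | "bridge"; // more domains can be added later

interface Intent {
  domain: Domain;
  fromChain: number;
  toChain: number;
  fromToken: string;
  toToken: string;
  amount: bigint;
  // Each intent picks the flow and verifier that match its
  // security / speed / cost requirements.
  flow: SettlementFlow;
  verifier: Verifier;
}

// The router only requires that an adapter can say whether it supports
// an intent and quote it, keeping the system neutral across intent standards.
interface FlowAdapter {
  supports(intent: Intent): boolean;
  quote(intent: Intent): bigint; // expected output amount
}

// A "fair auction across everything": every supporting adapter quotes,
// and the best output amount wins.
function bestQuote(intent: Intent, adapters: FlowAdapter[]): bigint {
  const quotes = adapters
    .filter((a) => a.supports(intent))
    .map((a) => a.quote(intent));
  if (quotes.length === 0) throw new Error("no adapter supports this intent");
  return quotes.reduce((best, q) => (q > best ? q : best));
}
```

The point of the sketch is the shape, not the details: because the router is written against the `FlowAdapter` interface, a resource-lock system, an ERC-7683 settler, and an escrow mechanism all compete in the same auction.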
What made you hold back from intents for such a long time and then go and acquire an intent system? An intent system is a multi-sided market, and with a multi-sided market you have a chicken-and-egg problem: you don't get the solvers if you don't have order flow, and you don't get the order flow if you don't have solvers. So the best way to execute this is to proxy one side. Tell me, how would you proxy the supply side in an intent system?
Well, you would aggregate the execution, right? LeiFi has been executing a marketplace strategy from day one, so I knew I would be perfectly positioned to build this out. We were looking at the ERC-7683 standard back then and saw its downsides, so we figured there were probably other things out there. We did more research and waited a little. We also wanted to understand different auction mechanisms, so we looked at CoW Swap, 1inch Fusion, UniswapX and all of that, to understand where this is all going. And once we felt confident we had seen enough to make a choice, whom to buy or whether to build it ourselves, we made it.
In terms of competition, look, I've seen the dashboards; more often than not you guys are the leaders. But only the paranoid survive, right? What do you think of competition in general in this space, in terms of direct competition?
Nothing comes to mind right now. We had some direct competition in the past, but direct competition doesn't really matter to us anymore. There are some; they have a few deals, but they don't really matter to us. And then there is the frenemy situation we have with many of the players we aggregate. Frenemy essentially means we are friends, but there is also some competition in there. A bridge will always prefer to be integrated directly rather than being put into competition by being aggregated in a B2B integration. However, the reality is that top-level applications prefer redundancy, fallback mechanisms, and best-price execution on the best route.