The People's AI
February 4, 2026

The Robot Revolution Is Here: Warehouse Automation, Humanoids, and What Comes Next

Robots: The Data Gap, Industrial Gains, and the Humanoid Future

By The People's AI


Robotics is quietly transforming industries, driven by specialized machines solving specific problems, not generalist humanoids. The critical bottleneck for widespread, reliable robot deployment is data, creating a massive gap compared to large language models.

  • 💡 What's the real-world state of robotics today, beyond the sci-fi hype?
  • 💡 Why haven't robots seen a "ChatGPT moment" despite rapid AI progress?
  • 💡 How will the immense data needs of future robots be met, and what are the implications for privacy and labor?

The robot future is not a distant sci-fi fantasy; it's already here, quietly reshaping our physical world. This episode unpacks the current state and future trajectory of robotics with insights from Jeff Mer (co-founder and CTO of Ambi Robotics), Thomas Fry (futurist and former IBM engineer), and Dr. Anaket Barah (Associate Professor at Purdue).

Top 3 Ideas

🏗️ The Data Bottleneck

"The biggest limitation in my mind is data. An LLM has gazillions of text data online... Now robotics is very difficult because we don't have a lot of real robot videos."
  • Data Scarcity: Unlike LLMs trained on vast internet text, robots lack real-world interaction data. This means progress in robotics is slower and more incremental, demanding new data acquisition strategies.
  • Reliability First: Robots need near-perfect reliability for physical tasks, unlike generative AI which can tolerate errors. This high bar requires exponentially more data to cover edge cases, making a "ChatGPT moment" for generalist robots unlikely.
  • Simulated Training: Companies like Ambi Robotics use physics models to create simulated environments where robots learn tasks like picking items. This approach helps bridge the data gap by generating synthetic training data before real-world deployment.

🏗️ Specialized Over Generalist

"We're talking about a 100,000x gap, which we call the robot data gap."
  • Industrial Focus: The immediate future of robotics lies in specialized, non-humanoid machines for warehouses, manufacturing, and logistics. These robots solve specific, high-value problems with high reliability, like Ambi Robotics' systems sorting millions of packages.
  • Form Follows Function: Robots are taking diverse forms—from bus-sized industrial sorters to mobile carts and surgical instruments—optimized for their tasks. This contrasts with the popular image of humanoids, which are less practical for most industrial applications.

🏗️ Future Abundance, New Challenges

"I think we're actually living in an amazing period of time in history. We're going through this transition right now which is making lots of people nervous and uneasy... At the same time, we're going to start discovering more opportunity than we've ever seen in all history."
  • Job Redefinition: Automation will consolidate jobs, moving human roles from manual labor to robot operation and oversight. This creates a path toward higher-level human endeavors and increased productivity.
  • Ethical Data: Acquiring robot training data often involves human teleoperation or "copy mechanisms" that record human movements. This raises significant privacy concerns and questions about fair compensation for data workers.

Actionable Takeaways

  • 🌐 The Macro Trend: AI-driven automation is not a sudden, generalist humanoid takeover, but a gradual, specialized deployment. This evolution is driven by the fundamental "robot data gap" – the 100,000x difference in available training data compared to LLMs – which prioritizes high-reliability, narrow applications over broad, human-like capabilities in the near term.
  • ⚡ The Tactical Edge: Invest in or build solutions for industrial automation, logistics, and specialized service robotics (e.g., medical, waste management). Focus on data collection methodologies that generate high-fidelity, real-world interaction data, potentially leveraging simulation or human-in-the-loop teleoperation, while navigating the ethical and privacy implications.
  • 🎯 The Bottom Line: The next 5-10 years will see significant, quiet growth in non-humanoid, task-specific robots transforming supply chains, manufacturing, and healthcare. The long-term vision of ubiquitous humanoids hinges on solving the data and power consumption hurdles, creating a massive opportunity for those who can bridge these gaps.

Podcast Link: Click here to listen

Let's start with this idea of rather than robots made out of metal, we actually start growing flesh on these robots.

This is Thomas Fry, a former IBM engineer who is now a futurist and who spends a lot of time thinking about robots. If you could actually grow skin on these surfaces and actually grow flesh on these robots, then they'll get to a point, I think by 2040 roughly, that they'll be very hard to distinguish from humans.

Okay, real talk: 2040 is not that far away, and this futurist thinks that replicants, like Data from Star Trek, are on the table. But what will it take to get there?

And before we go to crazy sci-fi lands, what will the world of robots look like? Where are we right now in the robot space given how quickly AI is evolving? And finally, what's a sneaky and overlooked role of data in all of this, your data, to make Planet Robot a reality?

We'll be exploring all of this and more on today's episode of The People's AI, presented by the Vana Foundation.

I'm your host, Jeff Wilser, a longtime journalist and a host of the AI Summit at Consensus. And I'm proud to partner with Vana to bring you this season of The People's AI.

Vana is supporting the creation of a new internet, one rooted in data sovereignty and user ownership. Their mission is to create a decentralized data ecosystem where individuals, not corporations, govern their own data and share in the value it creates.

Link: vana.org

Okay, robots. Obviously, this is a ginormous subject and we've got a lot of ground to cover. The global robotics market is estimated to be worth somewhere between $50 and $90 billion in 2026. Projections have it surpassing $110 billion by 2030.

In fact, our world has been roboticized for a long time. Many of us already have one in our homes. Consider the robo vacuum or the self-cleaning litter box.

But AI represents a paradigm shift in ability. Robots that can interpret and respond to the world around them rather than simply following a pre-programmed script. In other words, at least to some extent, we're on the verge of seeing robots that can think.

Where are we in that journey? And what does that mean for the world?

Here's a road map for the episode. We'll start with a quick landscape of where robots are now, where they might be headed in the nearest future, call it 5 to 10 years, and then we'll have some fun and look at robot scenarios for 2040 and beyond. Finally, we'll look at the main challenges that have to be overcome to make this a reality.

And every expert I spoke to stressed the importance not just of tech obstacles to overcome, or materials, or robot joints, things like that, but of the data that robots will need. To kick things off, I wanted to start with someone who's at the tip of the spear of actually bringing robots into the world.

So my background is in physical AI for robots to interact with the real world, picking items up, putting them somewhere, sorting, really organizing items in the physical world. And I started working on this when I started my PhD at UC Berkeley.

This is Jeff Mer, co-founder and CTO of Ambi Robotics. And in the world of robotics, we had almost no data that we could use to train robots to say pick up items like we wanted to.

So my PhD really ended up becoming about how to solve that data problem. And the way that we went about it was to take these classical physics models from even decades before that were analyzing whether or not robots could pick up items and use those to develop a simulation where a robot could learn to pick up items by essentially playing a video game where it gets points for successfully lifting an item out of a bin from some randomized scenario.

And this project was called the Dexterity Network or DexNet. And it was very successful. We were actually surprised to see how well it worked to train a neural network on just simulated data, take it to the real world, and it could pick up items it had never seen before.
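For readers who like to see the shape of that approach, here is a minimal, toy sketch of the idea, not Ambi's or Dex-Net's actual code: randomize simulated bin scenarios, label candidate grasps with a simple physics-style heuristic standing in for the analytic grasp models Jeff describes, and train a small model purely on the synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_scenario(n_grasps=10):
    """Randomize one bin scenario and score candidate grasps with a toy heuristic.

    The heuristic is a simplified stand-in for the analytic physics models described
    above: a grasp "succeeds" when it is close to the object and roughly aligned with it.
    """
    obj_center = rng.uniform(-1, 1, size=2)          # object position in the bin
    obj_angle = rng.uniform(0, np.pi)                # object orientation
    friction = rng.uniform(0.3, 1.0)                 # surface friction coefficient

    grasp_points = rng.uniform(-1, 1, size=(n_grasps, 2))
    grasp_angles = rng.uniform(0, np.pi, size=n_grasps)

    dist = np.linalg.norm(grasp_points - obj_center, axis=1)
    misalign = np.abs(np.sin(grasp_angles - obj_angle))
    success = (dist < 0.4) & (misalign < friction)   # toy "lift succeeded" label

    features = np.column_stack([dist, misalign, np.full(n_grasps, friction)])
    return features, success.astype(float)

# Build a purely synthetic training set; no real robot data required.
X, y = zip(*(simulate_scenario() for _ in range(2000)))
X, y = np.vstack(X), np.concatenate(y)

# Train a tiny logistic-regression "grasp quality" model by gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

# Evaluate on fresh simulated scenarios, a toy stand-in for sim-to-real transfer.
X_test, y_test = zip(*(simulate_scenario() for _ in range(500)))
X_test, y_test = np.vstack(X_test), np.concatenate(y_test)
pred = (1 / (1 + np.exp(-(X_test @ w + b)))) > 0.5
print(f"Held-out grasp-success accuracy: {np.mean(pred == y_test):.2f}")
```

The real systems use depth images and far richer physics, but the workflow is the part that matters here: randomize in simulation, label with a physics model, train, then transfer to the real world.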

That got us coverage in places like the New York Times, MIT Tech Review. We even got an invite to present it to Jeff Bezos at a secretive Mars 2018 conference in Palm Springs.

And we were definitely a bit nervous when he came up to the robot and started trying to trick it by putting various items in the bin. Like his associate took off his shoe and put it in there.

But the robot without missing a beat was able to pick these items up. And seeing his reaction to that, we realized that we had something that was not just a cool science project, but could actually make an impact in the broader world.

And that this was not something that a company like Amazon already had. So, we founded Ambi Robotics shortly thereafter with the mission of taking our technology, making a real world impact, and as we like to say, helping people handle more.

One reason I wanted to speak with Jeff is that he is not just in theory land. His robots are not just prototypes. They're quietly in the wild right now as we speak, stacking boxes in warehouses.

We are commercially deployed. We've deployed over 100 robots into production with customers. Our customers include some of the top names in the supply chain. So, it is not R&D. It is functioning and helping people every day.

And we're proud to say we've sorted over 100 million packages and have operated over 250,000 hours in production.

Now, these robots might not look exactly like what you'd expect, and that's sort of the point. They will not be confused for a human any time soon, and it's likely most robots won't be.

It's a large machine, about the size of a bus; it takes up, say, 600 square feet. There's a robot arm, but there are also larger robots that we would call Cartesian robots, which can span a large space in order to sort to many different locations.

And so these robots look more like what you might see in a shipping yard, like cranes that move shipping containers around. Some of our robots look more like that.

And so it's not really a human form factor. It's more of an industrial machine that has AI that allows it to adapt to different items that might come its way and handle variation that traditional automation couldn't handle.

Jeff's company, Ambi, is a good proxy for a new industry that you might be starting to see more of here and there. Nonhumanoid robots that are peeking into the corners of our everyday life.

At the Denver airport, I was recently in the United Club Lounge and was startled to see a robot spin around and carry dirty dishes away. And if you live on the East Coast in the US, you might have seen Marty, the retail robot with only marginally upsetting googly eyes and a perpetual grin, who glides around supermarkets looking for spills and hazards.

As Jeff Mer puts it, there are a lot of robots quietly out there and some areas where they're growing a lot. In manufacturing, there's been robots for a long time.

One of the areas where a lot of robots have been deployed quietly in the last 10 to 20 years is in mobile robotics. So more like carts that can drive around. And an example most people know of this is Amazon robotics.

These robots that drive shelves around to have products come to the person so they can pick your order. And there's a number of systems like that transporting shelves, but also transporting things like bins, driving around aisles in warehouses.

But now you're starting to see a lot more of them outside of the warehouse, around people. So delivering people's orders for DoorDash, things like Serve Robotics, and so on. That's been an area with a huge amount of growth, especially in the past decade.

That's been interesting. We have seen, and we're part of, a surge in logistics for robots that manipulate items, both in the warehouse but also in manufacturing. And there's been a lot of deployment in waste management, particularly sorting recycled materials, plastics, and things like that.

So that's something where I think there's actually still a ways to go to fully, you know, capture the value there that the robots can generate. But that's an area that's surging.

Another one is a medical setting. There's a lot of surgical robots out there today, but they're continuing to grow because there's a lot of innovations in terms of how those robots can help with remote procedures, more minimally invasive procedures, and I think there's still quite a long way we can go there to continue to improve things.

But for every robot you see in a grocery store in the US, you're likely to see a small army in China, which has aggressively leaned into robotics. Here's Thomas Fry, the futurist.

Right now, there are over 150 humanoid robot companies in China alone, and there are lots of them in other countries as well, in Korea and Japan and the United States.

And so we're going to have lots of robot companies that are going to specialize in very niche topics like elder care and child care and things like that, but also things that specialize in cleaning your yard, mowing your yard, fixing your house, cleaning your garage, things like that. There'll be real specialized robots just for those things.

One way to think of robots is in two different buckets: the humanoid generalist robots that make all the headlines and dazzle crowds at tech conferences and then the more specialized robots like the warehouse robots at Ambi or the robot taking away my dishes in the Denver airport.

And everyone I spoke to thought we're likely to see more growth and adoption first in the specialized domain.

And do we really need our robots to look like us?

I think what we're going to find is that robots, the humanoid robot will only be useful in certain applications. There'll be other types of robots that'll be much more useful. Like maybe some with four, five, six arms, four legs, wheels rather than legs, that sort of thing.

And so we're going to start seeing lots of experimentation with different forms of robotics. That's where it gets real interesting.

Okay, it's go time. Let's pivot to where all this is headed. What will the world of robots look like in the nearest term? Let's call it the next 5 to 10 years.

I think specialized is still the way to go because you get more controllability and more explainability of when things fail.

This is Dr. Anaket Barah, an associate professor in the department of computer science at Purdue. He directs the interdisciplinary ideas lab which is dedicated to the research and development of robotics. Right now he's working on trying to teach a robot to cook.

Now back to the two different buckets of humanoid/generalist robots and specialist robots. Anaket also thinks we'll see more real-world adoption from the specialists.

The humanoid form factor will exist, I think, in homes, in caregiver facilities, in hospitals. These are spaces where humanoids will exist. They will not just do the simple tasks but also become a very friendly person, like a robotic person, to interact with.

I think robots will come in different forms and shapes, but it'll be very application specific. That's because once you go application specific, there may be money from that community, and then they focus on something and get the problem solved.

The more generalist we go, it's like, oh, we are doing AGI, we are doing all of these fun, futuristic things. But I don't know if that is 5 years into the future. In 10, 15 years, maybe generalist robotics will take over, but right now, from an industrial, product-driven world, I don't see a direct application.

Like if you're trying to sell somebody a $10,000, or, I don't know, $10,000 is cheap, a $50,000 humanoid robot which will fold your clothes and do your laundry, I don't know how many people will buy it.

Thomas Fry, the futurist, imagines that robots will soon be heavily involved across a range of industries: things like warehousing, agriculture, healthcare, construction, food service, that sort of thing. And that'll create a lot of robot integration opportunities for people.

So, we have to have systems that will set us up so that we can train the robots to do what we want them to do. But we're going to have automated delivery stuff, as an example. And you're going to see this robot-automated delivery become as frequent as we see the Amazon delivery trucks now.

So, anything we want, we can get in much shorter time than we get it today. As for robots inside our homes, I think most of them will start off a little pricey, but the price will drop rather quickly as the competition is actually really stiff because the market is just absolutely massive.

But you can imagine having a house, and I think of drones as being robotic as well. You could have drones that actually dock with your house and recharge the batteries on your house.

Drones that take the trash out, or take the sewage out, or that actually deliver water to your house, all of the things that you normally have coming and going.

So, you could have a house that's sitting out in the middle of nowhere and it actually will deliver stuff and the services that you need without having to be connected to anything else.

Now, of course, this will almost certainly have an impact on jobs and the economy. Exactly how is obviously outside the scope of this episode. But even Thomas Fry, who's generally a tech optimist, concedes that jobs will be lost, although he also sees this as a path toward future abundance.

A lot of jobs will start consolidating. They'll start going away, and they'll have people that are in charge of a team of like five or six robots, and they'll get the work done that way. And then it'll get down so that you have one person with 10 or 20 robots, and so the jobs will get fewer and fewer.

At the same time, we will expand our thinking, and so we'll set our goals for much higher expectations. We'll look at bigger and bigger accomplishments.

So if you think about today, somebody can accomplish in just a short period of time what it took somebody 20 years ago many weeks to do. Especially writing an article: if you use AI, you can just bang it out in 15 minutes, and it took somebody like half a day to do that otherwise.

That's just one quick example, but we're going to see things where, rather than make a statue that's 6 feet tall, we might decide to make statues that are 100 feet tall, and build buildings, and build other types of enterprises. We'll just set our sights at a much larger scale.

But Jeff Mer, the co-founder of Ambi, cautions that this kind of robot revolution is unlikely to happen overnight. For a variety of reasons, we're unlikely to have a ChatGPT-style breakthrough moment like we had a few years ago with LLMs, where one day no one had heard of them and the next day they're ubiquitous and that's all we can talk about.

Instead, Jeff expects a more gradual shift. There's a lot of folks who think that there's going to be this transition point, this great leap, if you will, where all of a sudden robots will be able to do anything, just sort of like we had this ChatGPT moment with large language models.

And the thinking there is that once we get enough data, all of a sudden they'll be able to do all these things. But the challenge in robotics is more about being able to quickly get a robot to do some task with high reliability, rather than having a robot do anything kind of well, where it works, you know, 50 to 70% of the time.

And so where I see things being transformed in the next, say, 5 years are these areas where we can get robots deployed and doing reliable tasks in the next few years, and then we'll collect more data to start getting those working very, very well.

So some of those areas are continued innovation in warehouses, like I mentioned, because we can get robots to production today. That data flywheel is already spinning, improving the AI models, and there's definitely five more years of work, at least, for the kind of stuff we're doing in warehouses.

Manufacturing is another area. These high-mix manufacturing operations use AI. There's plenty more that will happen there. And then around our daily lives, I think the mobile robots will continue to grow.

With stuff in our home, I'm not as convinced that they'll be making a broad impact in the next 5 years. I think there's potential in certain applications, though.

For example, if we think about assisting elderly people in their homes, if we can go fetch things, even if it's very slow or doesn't work once in a while, there might be some value there. And so there I think we'll start to see those kinds of early markets opening up, where maybe robots that can grab items in the home might be able to help in certain, say, assisted living centers or things like that.

But I think it will take longer for this vision of, say, Rosie the robot from the Jetsons in your house.

All right. Well, since Jeff mentioned the Jetsons, let's go there now. Let's have some fun. What could the world of robots look like in say 15ish years, 2040 and beyond?

For Anaket, the excitement is about progress in medicine and healthcare, and the hope that we'll be able to solve problems we previously rejected because they were too hard to solve.

Like in the pharma space, I want these robots to do 24/7 testing of potential candidates for cancer. Like, how do I build this?

I mean, we are trying, actually. In some ways we are, you know, collaborating with some pharma companies here at Purdue, where everybody's doing different things, but lab automation is a very key, important thing: not only trying to blend different chemicals and see the reactions, maybe 200 of them in parallel as opposed to just one person looking at one experiment, but also using machine learning to figure out what the best candidates for certain solutions may be. Or maybe surgical robots, right?

That is a space where I think there's immense potential. Right now, surgery, any form of surgery, always has a lot of side effects, or always a potential for side effects, a potential for bad things to happen, because you're literally going inside the body and trying to tear things up, right?

Can we make them very, very precise, way more precise than humans? Can we do that? If we can, yeah. There are robots, magnetically driven robots, which you ingest like a pill, and they can go and, let's say, eat away your fat, or focus on targeted delivery, very targeted surgery with almost zero blood loss.

So these are spaces, like the medical and healthcare space, that I think will change completely 20, 30 years from now.

Then again, on a less sunny note, for every new robot in the hospital we might see a robot on the battlefield. If we are still around after 20, 30 years, if robots haven't taken over, I think defense is another space where things will become humanless.

I would say so, to some degree. Soldierless, not humanless; human lives will still be lost, but soldierless is my anticipation, because the defense space is already becoming very tech heavy. Which of course raises a ton of ethical questions about how these systems will be used. Okay, I'm not even saying it's good or bad. I'm just saying what I'm observing, because a lot of our lab is funded by the DoD, and I can see what the general trend is going towards.

Thomas Fry envisions a future where robots are doing a good chunk of the work. Let's just take warehousing as an example. We'll have robotic trucks that are transporting goods all the way across the country.

I don't know if you've seen these robotic forklifts. The forks themselves are robotic, and they go under a pallet and pick it up and move it wherever they want to go with it. We're starting to see some of the early edges of this stuff, but the whole supply chain will get roboticized, and so it becomes very seamless and very invisible.

So as an example, if you want to start a business and you have physical products that you're selling around the world, you want to roboticize the supply chain so that you can actually create your product and have it delivered wherever it needs to go. Make it seamless and invisible.

Getting into agriculture: farmers spend an enormous amount of time working the fields right now. Having robots out there, robotic tractors and robotic combines, working the fields, actually picking fruit from the trees, harvesting the grain, all of those things will get very much automated.

And one person owning 100,000 acres of land, doing it all with robots. We're going to see that as quite a common sight. I think we'll probably have a thousand different kinds of robots.

I think the delivery bots that can actually work as drones that fly and deliver a package to your house, that's one type of robot. We'll have humanoid robots that will walk amongst us. We'll have robots that can do every possible task we could imagine. Elder care, child care.

So, what will our homes look like in this robot future? The robot will have sensors that know exactly when you wake up. They'll know exactly what's going on. The whole house will come alive as soon as you wake up, whether that's in the bathroom, whether it's in the kitchen, whether it's in the entertainment room.

And so everything will be prepared for you as you walk around the house. I've been thinking for quite a while about whether I could have a music player that actually played music that invigorated me all the time, and it would know if something was starting to depress me and change to different music.

So music would be the performance enhancer. Music playing in different rooms all the time, and that would keep you in a positive mood, a positive frame of mind. It wouldn't be too loud or too soft, and then it would ask you lots of questions. It would keep you engaged. You'd want to have your life fulfilled, but most likely you would have several different operations going.

You might have two or three different businesses that you're running from your house, and you would have the robots helping you do that. And they would bring you up to speed first thing in the morning, give you a summary of what happened throughout the night, and then you could be off and running.

Jeff Mer sees robots being especially useful for letting humans stop doing dangerous things in dangerous jobs. One of the things that I see, and Ambi as a whole really believes in, is that humans will not have to be doing these sorts of very dangerous, dull, dirty tasks anymore, tasks that are, you know, dangerous for our bodies, injury-prone, manual tasks.

And so we see a world where robots are doing those sorts of things and humans are still needed but more in the role of robot operators. How can they sort of guide these robots to do the tasks well? Help them deal with corner cases or failure modes that we might not have captured yet.

Moving around heavy boxes is something we see a lot: unloading trucks, loading trucks, taking them off of shelves, putting them on pallets, and so on. These sorts of things are very tough for people to do because there's a lot of repetitive upper-body motion with items that can weigh 50 pounds or more.

And people as a result tend to not stay in those jobs very long, even if they like them, because your body is not really expected to be able to do that for a very long time. So I think those kinds of things are ones where robots will be able to do them very well.

I'm excited about what's possible in terms of surgical robotics. That was actually the first thing I started working on in grad school. And I'm keen on this area because I think it allows there to be a lot more flexibility for things like remote procedures.

It's like making less of a bottleneck of the surgeon having to be right there in the room doing things. And of course, we still have a surgeon in the loop. Things will become more and more automated, though, and surgeons can potentially lift to a higher level of operation. Arguably, they already do today.

And I think that technology progressing forward will be really, really interesting. I think robots will also be a really important part of the loop of essentially making new scientific discoveries, especially when you think about developing new materials or drug discovery or things like this.

If you listen to folks talking about AI, they want to be able to close this loop with the real world, so that the AI can propose experiments, but it needs to get feedback from those. I think robots are the way that AI gets feedback from those experiments, to make the whole cycle run faster so we can innovate and find solutions to problems like diseases much more quickly.

And I think that in 20 years we will start to see more of that. Will we be all the way there, at a utopia of abundance? I think probably not on that time scale. But I think that we will be a lot closer.

And this notion of people being robot operators will be a lot more of a common, shared notion amongst normal people.

And at this point in the show, it is time for my confession. It's my personal theory, this is just me here: that sometime within our lives, I don't know if it's 2040 or beyond, but sometime in our lives, you will be in this scenario or a scenario like it.

You'll be at a bar. You will look across the room. You'll see a woman and you will not be sure if she is human or a robot. I'm embarrassed to even use this word, but she might be a replicant.

Now, 5 years ago, I would have thought this sounded insane. But hear me out. We already have voice chat that can fool people and sound human. We already have what amounts to real time deep fake Zoom calls that can fool people.

And we're building humanoid robots. It's not really a leap in my mind to imagine bolting together this real-time chat with humanoid tech. And all it's really missing is synthetic facial expressions and eyeball movements.

Now, those are no small things, but it feels like tech that can be solved. It's just my wild theory, but it's one that Thomas Fry agrees with.

Let's start with this idea of, rather than robots made out of metal, we actually start growing flesh on these robots. It won't take that long to get to the flesh bots.

Flesh bots.

Yeah. I mean, this gets into the whole sexbot world and all that, but actually they're much more humanlike than you can imagine. If you could actually grow skin on these surfaces and actually grow flesh on these robots, then they'll get to a point, I think by 2040 roughly, that they'll be very hard to distinguish from humans, as we get closer to consciousness.

I mean, we still don't even have a good definition for what consciousness is. But once you have this self-awareness, this ability to respond to all kinds of external things around you, they're going to start feeling much more humanlike, and then we'll get into all these questions of, do they have rights? Those become really difficult questions to answer. And whatever hurdles, say, a Supreme Court would put in place, as in, you have to achieve this before you can actually be considered as having rights, we'll pass all of those a few years later.

Okay. Now, let's pivot back from distant sci-fi and crazy Jeff theory to planet Earth, here and now. What will it take to get to this robot future, whether it's the realistic gains from specialized robots or something more daring and exotic like the humanoids?

As Anaket rightly points out, there are a ton of technical hurdles that need to be cleared. The physical world is messy and complex. Robotics is inherently an interdisciplinary problem.

So there are big problems in each of those disciplines. Material science, you know: how do we build lighter robots? I mean, people are using different kinds of material properties, different physics, different chemistry to make things happen. So there are obviously limitations which come because of form factors.

You know, robots need to move. So how do you move them? Do you move them with spring-based mechanisms or, you know, gear-based mechanisms? Those are all different problems.

But I asked Anaket: okay, it seems like in the past few years there's been a ton of leaps-and-bounds progress with generative AI and less so with robotics. Why is that? What's the core constraint? Why isn't there a very big, what we call a foundation model, for robotics, like an LLM for robotics or a big generative AI for robotics? Why has there been a gap?

The biggest limitation in my mind is data. An LLM has gazillions of text data online, right? You can feed it, you can, you know, fine-tune it, everything is there. You want to generate images or videos? That's also easy, because there are lots of videos on YouTube, billions of videos on YouTube, and you use that video to predict what comes next. I mean, all of these are basically smart prediction algorithms, right? You're given x and some constraint, and you're predicting some other things based on the constraint.

Now robotics is very difficult because we don't have a lot of real robot videos. I mean, there isn't anything, right? So what do we train things on? We don't know how the robot will move; we cannot predict how the robot will move unless we have seen how it moves.

Jeff Mer agrees that robots are thirsty for data. So just starting by talking about modern AI, which is really these deep neural networks: they are incredible tools, but their lifeblood, if you will, is data.

They need to have huge amounts of data before you can start to see these emergent properties, like being able to talk and say reasonable things, come up with poetry, and so on. And just to put the scale of that data in context, it would take a person over 100,000 years to read all of the data that was used to train the GPT that a lot of people interact with on a regular basis.

And so if we then try to compare that with what we have in robotics, the amount of data that people are typically training on is maybe more like a few years' worth of data. So, just comparing this language model with robotics, we've got about a 100,000x gap, which we call the robot data gap.
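Just to sanity-check that comparison with some rough arithmetic of my own (the reading speed and corpus size below are assumptions for illustration, not figures from the episode):

```python
# Back-of-envelope version of the robot data gap. The corpus size and reading speed
# are assumed for illustration; the point is only that the ratio lands in the same
# ballpark as the ~100,000x gap Jeff describes.
WORDS_PER_MINUTE = 250          # assumed human reading speed
LLM_TRAINING_WORDS = 13e12      # assumed order of magnitude for a frontier LLM corpus
ROBOT_DATA_YEARS = 1            # order of magnitude of real robot interaction data

years_to_read = LLM_TRAINING_WORDS / WORDS_PER_MINUTE / (60 * 24 * 365)
print(f"Nonstop reading time for the corpus: ~{years_to_read:,.0f} years")   # ~99,000
print(f"Rough robot data gap: ~{years_to_read / ROBOT_DATA_YEARS:,.0f}x")    # ~99,000x
```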

But then you start to think about, well, is that even the right comparison? This is really complex. We're talking about not just images taken over time, but also interactions with the world. What does a robot feel when it's grabbing things, putting them away?

And so there's reason to believe that the amount that needs to be used for training could be even more than was used for these large language models, especially because we expect high reliability when we interact with items in our physical worlds.

There's a tendency to really quickly be dismissive of hardware products, I believe, when they don't work well. Like if your phone stops making calls, stops being able to connect to the internet, it's not going to take you long before you're going to be looking for a new phone. And I think the same problem exists with physical robots. We certainly experienced that in our industry.

But to get enough data to actually close that long tail, to get to 99% reliability, you need essentially exponentially more data over time. And so we don't know how to solve this problem today, and it's a very important piece of the puzzle.
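Here's a rough way to see why that long tail is so punishing; this is my own illustration of the intuition, not a calculation from the episode. Each extra "nine" of reliability means the remaining failures are ten times rarer, so you need roughly ten times more interaction data just to observe and correct them.

```python
# Each additional "nine" of reliability targets failure modes that are about 10x rarer,
# so the data needed to even witness them grows by roughly 10x each time.
for target in [0.90, 0.99, 0.999, 0.9999]:
    attempts_per_failure = 1 / (1 - target)   # how rare the remaining failures are
    print(f"{target:.2%} reliability: remaining failures occur about once per "
          f"{attempts_per_failure:,.0f} attempts")
```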

Okay, this raises the question: how will these robot companies acquire this data? How are they getting it now? And here's where it's easy to envision a thicket of privacy concerns.

There are multiple companies, big companies, where people are actually capturing robotic data from interaction with humans. So some companies have built what's like a copy mechanism. You wear sort of this Iron Man-ish device on your arm, not the whole body, and then the way you grab things or the way you interact, that whole thing will copy your motion.

So now, if you have these large data-producing companies say, okay, grab this mug or a cup 5,000 times, you start to get motion data into your machine learning systems: okay, this is how it is, this is how things work. And now tomorrow, if I get a different mug, I kind of know how to grab it, I know what the action of grabbing means. There's also the system of teleoperation, where a human is operating the robot from afar.
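To make that concrete, here's a hypothetical sketch of the data-collection side of teleoperation or a "copy mechanism." Everything here is made up for illustration: read_operator_pose stands in for whatever wearable rig or teleop interface a real company would use, and each "grab this mug" demonstration becomes one timestamped episode in a training dataset.

```python
import time
import numpy as np

def read_operator_pose():
    """Hypothetical stand-in for the wearable rig or teleoperation interface:
    a real system would return tracked joint angles and gripper state from the
    hardware. Here we just return random values so the sketch runs anywhere."""
    return np.random.uniform(-1, 1, size=7), float(np.random.rand() > 0.5)

def record_demonstration(task_label, duration_s=2.0, hz=20):
    """Log timestamped (joints, gripper) samples for one human demonstration."""
    samples = []
    t0 = time.time()
    while time.time() - t0 < duration_s:
        joints, gripper = read_operator_pose()
        samples.append((time.time() - t0, joints, gripper))
        time.sleep(1.0 / hz)
    return {"task": task_label, "samples": samples}

# "Grab this mug 5,000 times" becomes 5,000 labeled episodes like these
# (trimmed to 3 here so the sketch runs in a few seconds).
dataset = [record_demonstration("pick_up_mug") for _ in range(3)]
print(f"Recorded {len(dataset)} episodes, "
      f"{sum(len(d['samples']) for d in dataset)} samples total")
```

Multiply that by thousands of workers and millions of episodes and you can see both where the training data comes from and why the privacy and labor questions below follow immediately.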

You might remember the robot that went viral a few months ago, Neo, from the company 1X. Neo was a robot designed for your home to help you fold laundry and whatnot. But at least in the short term, Neo could at times be operated remotely by humans somewhere else on the planet, meaning they could be staring into your living room.

The reason, by the way, is to help give them more data. It all comes back to data. And the company says they have privacy preserving features in place. But still, you've got the concerns.

This also raises yet more questions about the data workers who will be helping gather data for robots. In an earlier episode this season of The People's AI, we talked to data labelers who spend 14-plus hours a day doing things like saying, this image is a cat, this is not a cat, and get paid below minimum wage.

Are we going to enter a new phase of underpaid robot data workers who spend all day filming themselves washing dishes and folding laundry hoping to make $5 an hour?

Now, as you consider that, there is still one more wrinkle.
