
By Turing Post
Quick Insight: This summary is for builders and investors who see past the awkward hardware demos to the massive intelligence layer being built underneath. We are moving from scripted automation to general-purpose reasoning agents that inhabit physical forms.
While robots at CES still look like awkward toddlers, Jensen Huang and Elon Musk argue that the intelligence layer is already here. The transition from scripted movement to generative AI simulation is creating a world where hardware is just a peripheral for a central reasoning brain.
"I don't think people understand how many robots there's going to be."
"When you picked up a tennis racket, you embodied the tennis racket."
"It's not capability. It's trust and uptime."
Podcast Link: Click here to listen

Hello, my fellow humans. During his appearance at CES, Jensen Huang, CEO of Nvidia, was asked when robots will have truly human-level capabilities, and his reply was really surprising.
When do you think we're going to get robots that actually have human-level capabilities? Because I know how fast the technology is moving.
Unfortunately, the robots at CES were not aware that Jensen Huang was so optimistic.
Well, robots do indeed still look funny, awkward, and clumsy. But jokes aside, last year robots made phenomenal progress, driven by generative AI that helps with simulation and training.
And I have no doubt that what we saw at CES is already old news, because, don't forget, we already have robots that are almost boringly helpful, which is exactly the point.
Personal assistance: robots like Reachy Mini, the true star this year, now combine vision, microphones, and on-device generative AI, so they can recognize people, respond to voice, and run custom behaviors from the Hugging Face model ecosystem.
This is the first step toward context-aware home robotics that can see, hear, and interact using real AI. Plus, Reachy Mini is open source, so the community can develop whatever use cases it needs.
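To make that "see, hear, respond" loop concrete, here is a minimal sketch in Python. It is not Reachy Mini's actual SDK: the `robot` object with `record_audio`, `capture_frame`, and `speak` is a hypothetical stand-in for the device's I/O, while the Hugging Face pipelines are real, small models of the kind that can plausibly run on-device.

```python
# Hypothetical behavior loop for a small desktop robot.
# The `robot` I/O object is an assumption; the Hugging Face pipelines are real.
from transformers import pipeline

listen = pipeline("automatic-speech-recognition", model="openai/whisper-tiny")
see = pipeline("image-classification", model="google/vit-base-patch16-224")
chat = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

def behavior_step(robot):
    """One 'see, hear, respond' iteration."""
    heard = listen(robot.record_audio(seconds=3))["text"]    # short WAV clip from the mic
    scene = see(robot.capture_frame())[0]["label"]           # camera frame as a PIL image
    prompt = (f"You are a small desk robot. You can see: {scene}. "
              f"The user said: {heard}. Reply in one short sentence.")
    reply = chat(prompt, max_new_tokens=40, return_full_text=False)[0]["generated_text"]
    robot.speak(reply)                                       # on-device text-to-speech
```

Swap any of the three checkpoints for another model from the Hugging Face ecosystem and you get a different custom behavior; that interchangeability is what makes the open-source angle interesting.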
Warehouses and logistics: autonomous mobile robots move from scripted routes to adaptive fleets. They now replan in real time, coordinate with humans, and handle mixed inventory.
Amazon, Ocado, and DHL are running systems that look less like automation and more like swarm logistics.
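What "replan in real time" means in practice is incremental path planning: the planner is re-run whenever newly sensed obstacles invalidate the current route. The sketch below is a vendor-neutral toy, not anyone's actual fleet software; the grid map and the `sense_obstacles` callback are assumptions.

```python
# Toy adaptive-route sketch: A* on a grid, re-planned when new obstacles appear.
import heapq

def astar(grid, start, goal):
    """Shortest path on a 4-connected grid; grid[r][c] == 1 means blocked."""
    rows, cols = len(grid), len(grid[0])
    frontier, seen = [(0, start, [start])], set()
    while frontier:
        _, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return path
        if (r, c) in seen:
            continue
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                h = abs(nr - goal[0]) + abs(nc - goal[1])   # Manhattan heuristic
                heapq.heappush(frontier, (len(path) + h, (nr, nc), path + [(nr, nc)]))
    return None

def drive(grid, start, goal, sense_obstacles):
    """Move toward goal, replanning whenever the sensed map invalidates the path."""
    pos, path = start, astar(grid, start, goal)
    while path and pos != goal:
        for r, c in sense_obstacles(pos):     # e.g. a dropped pallet or a person
            grid[r][c] = 1                    # update the shared map
        if any(grid[r][c] for r, c in path):
            path = astar(grid, pos, goal)     # a scripted route would simply stop here
            if path is None:
                return None                   # genuinely blocked in
        path.pop(0)                           # step to the next waypoint
        pos = path[0] if path else pos
    return pos
```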
Factories: vision-guided robotic arms have crossed a threshold in perception and dexterity. They can now handle variance: different parts, imperfect placement, fast reconfiguration.
This is why micro-assembly and inspection lines are getting shorter, faster, and cheaper to redeploy.
Hospitals: service robots stopped being pilots and became infrastructure. Medication delivery, UV disinfection, internal logistics, even surgical assistance are now integrated into hospital workflows.
This shift is crucial and, frankly, amazing. It's not capability. It's trust and uptime.
Agriculture: autonomous tractors and harvesters move from GPS followers to decision makers. Precision spraying, selective harvesting, night operation, lower chemical use.
This is robotics colliding with real-world economics.
Dangerous environments: inspection robots have been helping in dangerous environments for a long time. Now they gain autonomy, endurance, and better sensing.
Pipelines, mines, nuclear facilities, disaster zones: these systems now go where humans should not go at all, not just where it's inconvenient.
And have you heard the latest about Optimus? Watch this short episode where Elon Musk says how many Optimus units are coming and how fast they're going to be deployed.
It's Elon Musk. He's rarely on time, but he always delivers.
I mean, there are a million of these things to figure out. But who's going to have access to the first Optimus that does far, far better microsurgery than any surgeon on Earth, when you've only manufactured the first 10,000 of them? How do you dole it out?
I don't think people understand how many robots there's going to be.
Well, there's a video from Saudi Arabia where you said 10 billion by 2040. Are you still on that path?
That's a low number. This is a very exciting time.
And now I'm passing the mic back to Jensen Huang, who always has something interesting to say and who will be the perfect closing for this episode. Thank you for watching. Let me know what you think about robotics. Remember to like, subscribe, share, and leave your comments. Humanly yours, Turing Post.
We're developing technology in that area, and I know the rest of the industry is doing so as well. Locomotion is hard, but it's making incredible progress.
And so I think locomotion is the first; gross articulation and grasping will be the second; and fine motor skills will be the third. Not necessarily in that sequence, but in that degree of difficulty. The rest of it, the cognition skills, is evolving very quickly, as you know. And so we're going to have really, really great reasoning AI models that are resident inside the robot, so the robot will be able to reason very quickly, and anything it wants, it can tap the AI in the cloud for additional knowledge.
So you're going to see some pretty amazing things.
And then the one thing that I will say in addition to that is that these AI models are humanoid in nature.
But don't forget: you're a humanoid, but when you sit inside a car, somehow you embody the car. You're able to steer a car like it's an extension of you.
When you picked up a tennis racket, you embodied the tennis racket. Somehow your arm got longer.
And so AI has the ability to become multi-environment, meaning we train an AI model to be a humanoid, but it turns out it's a perfectly good manipulator, and it's a perfectly good self-driving car.
That day is probably going to come, too. And so I think that the next several years is going to be really exciting.