Watching Robotaxis Move from Desert Tracks to City Streets

I remember the late nights in the early 2010s, poring over grainy videos of DARPA Grand Challenge vehicles crawling through the desert, wondering if we'd ever see self-driving cars in everyday traffic. Back then it felt like science fiction.

Now the partnership between Nvidia and Uber has brought that future a little closer. They've announced plans to deploy robotaxis in Los Angeles and San Francisco during the first half of 2027, with the goal of reaching 28 cities around the world by 2028. The cars will run on Nvidia's technology, and for someone who's followed this space for years, it lands as another concrete step rather than a sudden leap.

The Details Behind the Rollout

Uber is bringing Nvidia's DRIVE Hyperion platform into its fleet, complete with the Alpamayo AI model that uses chain-of-thought reasoning to navigate messy real-world situations like construction zones or unpredictable pedestrians. This is Nvidia stepping beyond chips into full-stack Level 4 software.

The plan is careful and staged: first they'll collect data on local roads, then run operations with safety operators, and only later move to completely driverless service. A shared AI data factory using Nvidia's Cosmos platform will sort through both real and simulated driving data to keep improving the system.

On the hardware side, the DRIVE AGX Hyperion 10 setup offers serious computing power—more than 2,000 FP4 teraflops—along with support for cameras, radar, lidar, and other sensors. It is designed to receive over-the-air updates and meet the necessary safety standards.

Building on What Came Before

This builds on a collaboration the two companies started in late 2025, which aimed at deploying up to 100,000 vehicles over time. It also fits into Nvidia's work with a range of carmakers—BYD, Geely, Nissan, Stellantis, Lucid, Mercedes-Benz—who are using the same platform to prepare for Level 4 driving.

For Uber, the move fits a strategy of working with several autonomous-vehicle partners rather than relying on any single one. In time, riders might see more consistent and widely available options alongside traditional drivers.

Nvidia's Jensen Huang described the moment as the "ChatGPT moment" for physical AI, where robots begin to reason through the chaos of the actual world. Uber's Dara Khosrowshahi focused on the potential for safer, more reliable transportation that more people can use.

The Practical Hurdles Still Ahead

I've watched enough cycles in this industry to keep expectations in check. Timelines for self-driving technology have shifted many times before, caught on safety questions, regulations, and the sheer complexity of city streets. Scaling to driverless service across dozens of cities on multiple continents will test every part of the system.

There are larger concerns too: what it means for drivers' jobs, how cities will adapt their roads and rules, the risks around cybersecurity, and whether the public will trust the cars. Nvidia has introduced the Halos Certified Program to let independent reviewers assess safety and security, which might help ease some of that doubt.

When you set this against what Waymo is already doing commercially in multiple cities, or the ambitions of Tesla and several Chinese companies, the Nvidia-Uber alliance stands out for combining strong hardware with Uber's huge ride-hailing network. How it performs in those first Los Angeles and San Francisco runs will tell us a lot.

Why It Feels Significant

On a quieter, more human level, reliable autonomous options could ease commutes, cut down on accidents, and open up mobility for people who don't drive. The partnership reflects the slow merging of AI with robotics and the physical environment that has been underway for some time.

It doesn't feel like an overnight revolution, but more like a practical foundation—taking today's most capable computing and software and applying them to the ordinary chaos of city traffic. Having followed the story since those early desert experiments, I'm guardedly hopeful.

The real test will come in how well the models learn from actual urban data, how safely the service scales, and whether the expansion happens responsibly. For the moment, it is one more sign that the way we move around is changing, one reasoned decision at a time.