By: Bryan Tropeano
You’d assume a car that can drive itself would have no problem navigating a simple detour, but that assumption was put to the test this week when a group of autonomous delivery vehicles ground to a halt in downtown Los Angeles.
The issue wasn’t a crash or bad weather. It was construction. A major road closure rerouted traffic through a maze of temporary lanes, cones, and handwritten signs. Human drivers crept through without much trouble. The self-driving vehicles, on the other hand, didn’t fare nearly as well.
Several of the cars stopped mid-route, confused by shifting lane markings and temporary signage that didn’t match what their maps expected to see. Some pulled over and waited. Others blocked narrow streets altogether, forcing traffic to funnel around them.
At first, it looked like the vehicles simply couldn’t improvise when the real world didn’t match their digital instructions. And to be fair, that’s not entirely wrong. The company behind the vehicles has now acknowledged that its systems struggled with what it calls “dynamic construction environments.”
In a statement, the company said its autonomous software is capable of handling road changes, but only up to a point. When conditions fall outside those parameters, the system can request human assistance to confirm the safest path forward. That’s exactly what happened here.
The problem was scale. With multiple vehicles hitting the same unexpected construction zone at once, the number of assistance requests quickly piled up. Human operators couldn’t respond fast enough, leaving some cars effectively frozen in place while they waited for confirmation.
This kind of human-in-the-loop system was designed as a safety net, not as a primary decision-maker. But when enough edge cases stack up, that safety net starts to look more like a bottleneck.
The company says it’s already working on updates that will allow the vehicles to better interpret temporary signage and construction layouts in real time. They’re also exploring ways to reduce reliance on manual intervention when multiple vehicles encounter the same scenario simultaneously.
It’s a reminder that autonomous vehicles, like Waymo’s, perform best in a world that behaves predictably. The messier things get, whether it’s power outages, construction zones, or improvised detours, the more those systems reveal their limits. For now, the future of self-driving still comes with a few asterisks.