Waymo Recalls Self-Driving Taxis After Reported School Bus Incident
Waymo is under fresh federal scrutiny after a reported incident involving one of its fully driverless robotaxis and a stopped school bus in Atlanta. What began as a preliminary probe into roughly 2,000 self-driving vehicles has now grown into an official recall covering 3,067 Waymo taxis equipped with the company’s 5th Generation Automated Driving System. At the heart of the issue is whether the software correctly handled one of the most sensitive situations on the road: a school bus stopped to let children off.
According to investigation documents, the incident took place on September 22, 2025. A Waymo taxi operating without a human driver approached a school bus that was stopped with its red lights flashing and its stop sign and crossing-control arm extended. Reports say the robotaxi initially came to a halt beside the bus, then proceeded to drive around the front of it and along the opposite side while students were disembarking. For regulators and parents alike, that is the kind of behavior that immediately raises alarms.
The National Highway Traffic Safety Administration’s Office of Defects Investigation opened a case focused on Waymo’s 5th Generation Automated Driving System to understand how the vehicle made that decision. In its recall filing, the agency points to software logic that could allow the taxis to pass stopped school buses even when the red lights are flashing and the stop arm is extended. Waymo says a corrective software build began rolling out to the fleet on November 5 and was deployed across all affected vehicles by November 17, effectively patching the issue over the air.
Waymo, for its part, has acknowledged the investigation and says it has already made changes to improve how its vehicles behave in similar situations, with more refinements planned. A company spokesperson explained that in the Atlanta case, the school bus was partially blocking a driveway that the robotaxi was trying to exit, and that the vehicle did not have a clear view of the bus’s warning lights or stop sign. That explanation may help frame the technical challenge, but it does little to ease the broader concern that an automated system still misjudged a scenario where human drivers are expected to exercise maximum caution.
For the wider autonomous driving industry, this recall is another reminder that public trust will hinge on how these vehicles behave in the most vulnerable traffic environments, not just how smoothly they handle highway cruising or downtown gridlock. School zones and stopped school buses are non-negotiable safety situations in the eyes of regulators and the public, and any misstep there will attract outsized attention. As Waymo and its rivals push ahead with driverless services in cities across the country, they will need to show that their software can consistently outperform a conscientious human driver in exactly these high-stakes, low-margin situations.
