Screenshot of one of the incidents of a Waymo robotaxi illegally passing a stopped school bus. | Credit: Austin ISD
While engineers dream of a driverless future, their creations sometimes fail elementary tests of the rules of the road. Waymo, the flagship of the robotaxi industry, has found itself at the center of a scandal: its autonomous cars systematically ignored stopped school buses. The result is a voluntary software recall and uncomfortable questions about how far we are willing to trust artificial intelligence with what we value most.
Not a bug, but a "feature" of perception
The problem is not new. Back in September 2025, an Atlanta TV station aired footage of a Waymo vehicle illegally driving past a stopped school bus. It was not an isolated incident: the National Highway Traffic Safety Administration (NHTSA) opened an investigation, which revealed disturbing details.
According to a letter from the Austin School District, since the beginning of the school year there have been 19 separate cases of Waymo robotaxis "illegally and dangerously" driving past buses with their warning lights flashing and the stop sign extended. In one incident, a car passed "literally a moment after the student crossed the road in front of it, while the child was still on the roadway." What makes the situation particularly galling is that five of these violations occurred after Waymo had assured the district it had released a software update that fixed the problem.
The algorithm, it seems, treated the yellow bus with flashing lights as nothing more than a decorative element of the urban landscape. The school district was forced to demand that Waymo suspend operations during pick-up and drop-off hours, an unprecedented demand of a tech giant.
Waymo's reaction: pride, statistics, and a "voluntary" recall
Waymo's response was a classic piece of corporate rhetoric mixing pride with contrition. Mauricio Peña, the company's chief safety officer, said: "We are incredibly proud of our outstanding safety record, which shows that Waymo is involved in twelve times fewer pedestrian collisions than human drivers." However, he added, the highest safety standards oblige the company to acknowledge when its behavior can be improved. It therefore made the "voluntary" decision to recall the software responsible for proper braking and stopping in such scenarios.
The company claims that it has identified the cause of the failure and that subsequent updates will fix it. Fortunately, according to official data, no one was injured in these incidents.
Context: Millions of miles and growing ambitions amid doubts
This incident occurred against the backdrop of Waymo's impressive growth. By July 2025, its robotaxis had traveled more than 100 million miles, and the fleet continues to add 2 million miles weekly. The company has completed more than 10 million paid trips and operates in several major American cities, including Atlanta, Austin, and San Francisco. Its expansion plans are staggering in scale, stretching from Detroit and Miami to London and Tokyo.
However, the bus story is not the company's first recall. In May 2025, Waymo had already recalled 1,212 robotaxis over the risk of collisions with guardrails. Despite statistics showing a 91% reduction in serious accidents, every such incident strikes at what matters most: public trust. If the system cannot handle one of the most basic and emotionally charged rules of the road, how can it be trusted in complex, non-standard situations?
The future: not replacement, but integration (and new business models)
The incident highlights a fundamental problem: driverless systems are not created in a vacuum. They must operate in a chaotic world built for people and filled with unpredictable participants. It is impossible to hard-code every possible scenario. The future most likely lies not in the complete replacement of humans, but in careful integration, with AI handling routine tasks under strict supervision and control.
Interestingly, as these "digital employees" grow more sophisticated, new services are appearing to manage them. Already there are platforms that help companies not buy but effectively "hire" robotic solutions for specific tasks, managing their "labor" as a service. This is the logical next step in the market's evolution: from owning hardware to using intelligent capabilities on demand.
Conclusion: a lesson that is worth a lot
The Waymo recall is not a failure of the technology as a whole. It is a painful but necessary growing pain. It reminds us that the road to an autonomous future is paved not only with millions of test miles, but also with incidents that force companies to review, refine, and improve their algorithms. The key question now is how quickly and effectively the company, and the industry as a whole, can learn such lessons. At stake is not just brand reputation, but real lives on roads that robots promised to make safer. So far, they have spectacularly failed the test of attention to the details of the human world.