Waymo robotaxis face school-bus safety scrutiny as former chief criticises Tesla’s camera-only approach

Waymo’s driverless taxis face new safety questions after recent incidents in which the vehicles passed stopped school buses with their stop arms extended. At the same time, former Waymo chief executive John Krafcik has criticised Tesla’s reliance on cameras for its driver assistance system, calling the strategy “myopic”. The combination of fresh video evidence and sharp words highlights a core debate in automated driving: which sensors, rules, and safeguards best protect people outside the vehicle, especially children boarding or leaving a bus. The episodes also underline how real-world behaviour at low speeds and in everyday traffic can test the limits of automated systems as much as complex motorway scenarios.

The incidents matter beyond company rivalry. School bus stops create predictable, high-risk moments on local roads. Many US states require all lanes of traffic to halt when a bus displays red lights and a swing-out stop sign. Any vehicle that drives past, human-driven or automated, draws immediate concern from parents, schools and regulators. With driverless services expanding, these cases add pressure on operators to prove their systems consistently recognise and respond to school-bus warnings.

Sensor strategies under the spotlight

Krafcik’s “myopic” remark focused on Tesla’s choice to rely on cameras for its advanced driver assistance system, marketed as Full Self-Driving. Tesla’s system remains a Level 2 driver assistance technology, which means the human driver must stay alert, keep hands on the wheel, and take full responsibility at all times. Tesla removed radar and ultrasonic sensors from new vehicles in recent years and emphasised camera-based perception supported by software and neural networks.

Waymo uses a different stack for its driverless taxis. Its vehicles combine cameras with lidar and other sensors to build a view of the road. Lidar uses laser pulses to measure distance and shape with high precision. Waymo also maps its operating areas in detail and runs its service within defined zones. The two approaches reflect a wider industry split over redundancy. Some developers argue more sensor types reduce blind spots and improve performance in low light or glare. Others argue high-quality camera data and advanced software can achieve the same goals with a simpler, cheaper hardware set-up.
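To illustrate the redundancy argument in simplified terms, the sketch below shows how a system with two sensor modalities could cross-check detections before acting on them. The class, thresholds and fusion logic are hypothetical and are not drawn from Waymo's or Tesla's software.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str          # e.g. "school_bus"
    confidence: float   # perception score between 0.0 and 1.0
    distance_m: float   # estimated range to the object in metres

def fuse(camera: Optional[Detection], lidar: Optional[Detection]) -> Optional[Detection]:
    """Keep an object if the two modalities agree, or if one is very confident.

    The point of redundancy: when glare or low light degrades the camera,
    lidar range data can still confirm or reject the detection.
    """
    if camera and lidar and camera.label == lidar.label:
        # Agreement between modalities: keep the detection and trust the lidar range.
        combined = min(1.0, max(camera.confidence, lidar.confidence) + 0.1)
        return Detection(camera.label, combined, lidar.distance_m)
    # Otherwise only accept a single-sensor detection that clears a high bar.
    for det in (camera, lidar):
        if det and det.confidence >= 0.9:
            return det
    return None
```

A camera-only approach removes this cross-check and relies on software alone to handle degraded images, which is the trade-off at the heart of the dispute.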

School-bus laws and why they matter to automated vehicles

In most US jurisdictions, drivers must stop for a school bus that has its red lights flashing and stop arm extended. The rule aims to protect children crossing the road. Exceptions are limited, such as when a median physically divides opposing traffic. For automated vehicles, this scenario tests two tasks at once: recognising a specific vehicle state and applying a strict rule that prioritises safety over traffic flow.

Videos in recent days have shown Waymo vehicles moving past stopped school buses. The law in many places treats that as a violation when the stop arm is visible and red lights flash. These clips have led to questions about perception, rule interpretation, and the logic that decides when a driverless vehicle should yield. Developers typically encode explicit behaviours for school buses and train their systems on large datasets. Still, unusual angles, partial occlusions, poor weather, and complex urban layouts can challenge recognition.

Oversight, reporting and recent safety interventions

US federal and state regulators continue to monitor automated driving closely. The National Highway Traffic Safety Administration (NHTSA) collects crash reports from companies that test or deploy automated systems on public roads. In California, the Department of Motor Vehicles (DMV) regulates testing and deployment and can suspend permits if it finds safety risks or inaccurate reporting. These frameworks aim to ensure companies investigate incidents, update software, and communicate with authorities when systems fail to meet expected standards.

Recent enforcement actions show regulators will step in when safety falls short. In 2023, California halted driverless operations by a separate provider after a serious pedestrian incident, prompting a wider review of company processes and technical controls. While that case did not involve school buses, it underscored how city and state officials can limit or revoke access if they find a risk to public safety. The new school-bus videos will likely feed into the ongoing scrutiny of how automated vehicles behave around vulnerable road users.

Operational design domains and service limits

Automated vehicles work within an operational design domain, or ODD. The ODD defines where and when the system can drive without a human at the controls. It covers mapped areas, road types, speed limits, weather conditions, and specific scenarios such as construction zones or emergency scenes. Waymo operates driverless ride-hailing in selected US cities under local permissions and within an ODD that the company says matches its capabilities.
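As a simplified illustration of the categories an ODD covers, the sketch below expresses those limits as configuration and checks current conditions against them. The field names and values are hypothetical and do not describe any operator's actual service definition.

```python
# Hypothetical ODD expressed as configuration; values are illustrative only.
ODD = {
    "service_area": "mapped_zone_example_city",    # detailed, pre-mapped region
    "road_types": {"residential", "arterial"},     # e.g. motorways excluded
    "max_speed_limit_kph": 72,
    "allowed_weather": {"clear", "light_rain"},
    "excluded_scenarios": {"active_emergency_scene", "unmapped_construction"},
}

def within_odd(conditions: dict) -> bool:
    """Return True only if every observed condition falls inside the ODD."""
    return (
        conditions["zone"] == ODD["service_area"]
        and conditions["road_type"] in ODD["road_types"]
        and conditions["speed_limit_kph"] <= ODD["max_speed_limit_kph"]
        and conditions["weather"] in ODD["allowed_weather"]
        and conditions["scenario"] not in ODD["excluded_scenarios"]
    )

# Example: a residential street in light rain, inside the mapped zone.
print(within_odd({
    "zone": "mapped_zone_example_city",
    "road_type": "residential",
    "speed_limit_kph": 40,
    "weather": "light_rain",
    "scenario": "normal_traffic",
}))  # True
```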

Even with a defined ODD, companies must address edge cases that occur within those limits. School-bus stops fit that description: they happen often, but the exact location and timing can change daily. Developers typically combine object detection with rule-based responses to handle these cases. They also collect real-world data and update software as they see new patterns. Incidents that involve school buses may prompt further refinements, including stricter geofencing around active stops, updated detection models for stop arms and lights, or conservative rules that force an immediate halt when a bus appears ahead.
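As a simplified illustration of the conservative, rule-based response described above, the sketch below commands a full stop whenever a detected school bus is showing its warning signals. The function, field names and the single exception shown are assumptions for illustration; real planners weigh many more signals.

```python
from enum import Enum, auto

class Command(Enum):
    PROCEED = auto()
    STOP = auto()

def school_bus_rule(detections: list, divided_median: bool) -> Command:
    """Stop for any detected school bus that is displaying its warning signals."""
    for obj in detections:
        if obj.get("label") != "school_bus":
            continue
        warning_active = obj.get("red_lights_flashing") or obj.get("stop_arm_extended")
        oncoming = obj.get("direction") == "oncoming"
        # Halt unless the bus is oncoming across a physical median,
        # the narrow legal exception noted earlier.
        if warning_active and not (oncoming and divided_median):
            return Command.STOP
    return Command.PROCEED

# Example: a bus ahead with its stop arm out forces a stop.
print(school_bus_rule(
    [{"label": "school_bus", "stop_arm_extended": True, "direction": "same"}],
    divided_median=False,
))  # Command.STOP
```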

What this means

For parents, schools and local authorities, these incidents reinforce a simple message: treat driverless vehicles like any other road user and watch closely at bus stops. Riders who use driverless services can expect companies to review and update their systems to handle school-bus scenarios more cautiously. Developers face continuing pressure to validate detection and decision-making around vulnerable road users, including children. Regulators have clear pathways to request data, review logs, and demand changes if they see gaps in performance.

For Tesla owners, the debate over cameras versus multi-sensor systems does not change a core requirement: Full Self-Driving remains a driver assistance feature, and drivers must supervise it at all times. For Waymo and other operators, the school-bus clips highlight the need to demonstrate consistent, conservative behaviour in situations where the rules are unambiguous and the stakes are high. Passing a stopped school bus is a clear safety failure in most jurisdictions, regardless of whether the vehicle is human-driven or automated. As driverless services expand into more neighbourhoods, their acceptance will depend less on how they handle rare edge cases at speed, and more on how reliably they respond to everyday risks that communities already understand and expect drivers to respect.

More broadly, the episode brings the sensor debate back into practical focus. Cameras, lidar, radar and maps are not abstract engineering choices when they shape how a vehicle reacts around children, pedestrians and cyclists. Each approach carries trade-offs in cost, complexity and redundancy. Incidents like these provide regulators and the public with concrete examples to judge whether those trade-offs are being managed responsibly.

In the near term, companies operating automated vehicles are likely to review their handling of school-bus encounters, adjust software logic, and share findings with regulators. For policymakers, the cases offer a real-world test of whether existing oversight frameworks are sufficient to catch and correct safety gaps quickly. For communities, they reinforce the expectation that any vehicle using public roads must follow the same protective rules designed to keep children safe.

When and where

The incidents involving Waymo vehicles and comments from former Waymo chief executive John Krafcik were reported in January 2026, based on video evidence and public statements covered by US technology and transport media, including Reuters and The Verge.

Author

  • Jeremy Jones, Automotive Industry Reporter

    Jeremy Jones is an automotive industry reporter covering manufacturer announcements and transport regulation.