Robotaxi Companies Are Hiding How Often Humans Secretly Take Over
Self-driving cars are on public roads right now — but how often does a hidden human operator quietly step in to keep them from failing? That is the question a U.S. Senate investigation set out to answer. The result: every major robotaxi company declined to answer it. Here is what we know, why it matters, and what could change next.
| Credit: Stan Grossfeld/The Boston Globe / Getty Images |
A Senator Demanded Answers. The AV Industry Went Silent.
In February 2026, Senator Ed Markey sent formal letters to seven of the biggest names in autonomous vehicle technology — Aurora, May Mobility, Motional, Nuro, Tesla, Waymo, and Zoox. He asked each company a list of 14 pointed questions about their use of remote assistance operators — the human staff who monitor and sometimes guide self-driving vehicles from a distance.
The findings, released this week, were striking. Not one of the companies directly answered how often their remote operators are called into action. Waymo and May Mobility went further, explicitly labeling that information as "confidential business information." Tesla omitted the question from its response letter entirely, and offered no explanation for the omission.
This is not a minor technical footnote. These vehicles operate on public streets, carrying real passengers, in cities across the United States. The public — and regulators — arguably have a right to know how much human intervention is still propping up this technology.
What Remote Operators Actually Do Inside Robotaxis
To understand why this matters, it helps to know what remote assistance operators do. When an autonomous vehicle encounters a situation it cannot handle — an unusual traffic pattern, a blocked road, an unexpected pedestrian movement — it can send a signal requesting guidance from a remote human agent.
That person, sitting in a monitoring center sometimes thousands of miles away, can then provide input to help the vehicle navigate the situation. In Waymo's case, its chief safety officer revealed during a Senate hearing that roughly half of these remote staff members are based in the Philippines. The company insists those workers hold local drivers' licenses, but Markey's office pushed back, noting that a foreign license does not reflect knowledge of U.S. road rules.
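The request-and-guidance loop described above can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration, not any company's actual system: the names (`AssistanceRequest`, `handle_request`, the scenario strings) are invented for clarity. It also reflects Waymo's stated claim that many requests resolve onboard before a human ever responds.

```python
from dataclasses import dataclass
from enum import Enum

class Resolution(Enum):
    SELF_RESOLVED = "vehicle recovered on its own"
    REMOTE_GUIDED = "remote operator supplied guidance"

@dataclass
class AssistanceRequest:
    # Hypothetical payload a stuck vehicle might send to a monitoring center.
    vehicle_id: str
    scenario: str          # e.g. "blocked_lane", "unclear_construction_zone"
    self_resolved: bool    # did onboard software recover before an operator replied?

def handle_request(req: AssistanceRequest) -> Resolution:
    """Route a help request: requests the vehicle resolves on its own are
    logged as such; the remainder receive guidance from a remote agent."""
    if req.self_resolved:
        return Resolution.SELF_RESOLVED
    return Resolution.REMOTE_GUIDED

# Example: a vehicle stuck at a blocked lane that did not recover on its own.
result = handle_request(AssistanceRequest("veh-042", "blocked_lane", self_resolved=False))
print(result.value)  # remote operator supplied guidance
```

The key intervention metric the Senate investigation sought — and the companies withheld — is simply the rate at which this function returns the second branch per mile driven.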
This raises a legitimate concern. The skills required to safely advise a vehicle navigating a San Francisco intersection may not carry over for someone whose driving experience is rooted in a different country, with different laws and road norms. It is a gap that no federal standard currently addresses.
The Transparency Problem That Has Followed Autonomous Vehicles for Years
Autonomous vehicle companies have long been reluctant to share operational data. During the early testing years, that hesitancy was somewhat understandable — the technology was experimental and the competitive landscape was fierce. But the calculus has shifted.
Waymo now offers commercial robotaxi rides in multiple U.S. cities. Aurora has begun operating self-driving semi-trucks on public highways. These are no longer prototype experiments. They are live commercial deployments, carrying passengers and cargo, interacting with ordinary drivers and pedestrians every single day.
Yet the information gap persists. Markey's office described the investigation as revealing a "patchwork of safety practices" across the industry — with significant variation in operator qualifications, response times, and staffing locations, and no federal standards governing any of it. The word used in the official report was "stunning."
What the Companies Did and Did Not Reveal
Despite refusing to share frequency data, the companies' responses did contain some useful fragments of information.
Waymo acknowledged that improvements to its self-driving software have reduced the number of remote help requests per mile over time. It also noted that a large majority of those requests are resolved by the vehicle's own system before a remote agent even responds. But it provided no figures, no timelines, and no independent verification of these claims.
On the question of direct vehicle control, every company except Tesla said their remote operators cannot directly control the vehicles. Tesla was the outlier. It confirmed that its remote operators are authorized to take over direct control as a last resort — but only when the vehicle is traveling at two miles per hour or slower, and control cannot exceed ten miles per hour. The stated purpose is to help move a vehicle that is stuck, without waiting for a field technician or first responder to arrive in person.
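Tesla's two thresholds — engage only at two miles per hour or below, never command more than ten — amount to a simple gating rule. The sketch below is an assumed illustration of that rule as described in the response letter, not Tesla's actual implementation; the function names and structure are hypothetical.

```python
ENGAGE_MAX_MPH = 2.0   # remote takeover may engage only at or below this speed
DRIVE_CAP_MPH = 10.0   # once engaged, the commanded speed is capped here

def can_engage_remote_control(vehicle_speed_mph: float) -> bool:
    """Remote takeover is permitted only when the vehicle is nearly stopped."""
    return vehicle_speed_mph <= ENGAGE_MAX_MPH

def clamp_commanded_speed(requested_mph: float) -> float:
    """Whatever speed the operator requests, the vehicle never exceeds the cap."""
    return min(max(requested_mph, 0.0), DRIVE_CAP_MPH)

print(can_engage_remote_control(1.5))   # True: slow enough to take over
print(can_engage_remote_control(5.0))   # False: vehicle is moving too fast
print(clamp_commanded_speed(25.0))      # 10.0: request capped at the limit
```

The design intent is narrow by construction: a crawling, stuck vehicle can be nudged clear of an intersection, but a remote operator can never drive it at road speed.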
The investigation also surfaced data on system latency — how long it takes for a remote operator's input to reach the vehicle. May Mobility reported the worst-case figure among the group, at 500 milliseconds. That is half a second of delay in a situation that may already be urgent.
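Half a second is easy to underestimate. A quick back-of-the-envelope calculation (my own arithmetic, not a figure from the investigation) shows how far a vehicle travels before a remote operator's input arrives at May Mobility's reported worst-case latency:

```python
def distance_traveled_m(speed_mph: float, latency_s: float) -> float:
    """Distance a vehicle covers during the round-trip delay, in meters."""
    meters_per_second = speed_mph * 1609.344 / 3600.0  # mph -> m/s
    return meters_per_second * latency_s

# At 500 ms of latency:
for mph in (10, 25, 40):
    d = distance_traveled_m(mph, 0.5)
    print(f"{mph} mph with 500 ms latency -> {d:.1f} m traveled blind")
```

At 25 mph, the vehicle covers roughly five and a half meters — more than a car length — before the operator's input can take effect.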
Overseas Staffing and Safety Standards Raise New Questions
The revelation that Waymo employs remote assistance workers in the Philippines deserves more scrutiny than it has received. Waymo was the only company that admitted to overseas staffing in response to Markey's inquiry.
The company maintains that these workers are qualified and that the remote assistance role does not require the same judgment as physically driving a car. There is a reasonable argument there. But the counterargument is equally reasonable: the people guiding these vehicles through real emergencies should understand the legal and physical environment those vehicles are operating in.
Markey's office made the same point explicitly. A driver's license issued in another country is not evidence of familiarity with U.S. traffic law. And without federal standards defining what qualifications remote operators must hold, companies are essentially setting their own rules.
What Happens When a Robotaxi Gets Stuck?
One of the more visible frustrations with today's robotaxi deployments is the "frozen vehicle" problem. A robotaxi that encounters something it cannot interpret — a construction zone, a downed tree, a confusing merge — can sometimes simply stop and wait. If no remote operator intervention resolves the situation quickly, emergency services may be called.
Waymo faced pointed criticism from San Francisco city officials this month over its reliance on first responders to move immobilized vehicles. The company does operate a separate roadside assistance team for physical recovery situations, but that function is distinct from the remote operator role examined in Markey's investigation.
Tesla's approach — allowing limited low-speed remote control to prevent exactly this scenario — is notable in that context. Whether it is safer or riskier than other approaches is a question that deserves open, evidence-based examination. That examination is currently impossible without the data.
Federal Regulation Is Coming, But the Clock Is Ticking
Senator Markey stated this week that he is calling on the National Highway Traffic Safety Administration to launch a formal investigation into how autonomous vehicle companies use remote assistance workers. He also indicated that legislation is in development, aimed at imposing "strict guardrails" on these operations.
What form that legislation takes will matter enormously. At a minimum, advocates say it should require companies to publicly report remote operator intervention rates, establish baseline qualification standards for remote staff, clarify whether overseas staffing is permissible, and set latency thresholds for remote assistance response times.
The autonomous vehicle industry has consistently argued that over-regulation could slow innovation. That is a familiar argument, and it has genuine merit in early research phases. But when the technology is deployed commercially — when real people are riding in these vehicles on real roads — the bar for transparency must shift accordingly.
Public Trust and the Future of Self-Driving Cars
There is something quietly significant about the fact that every company in this investigation declined to share the most basic operational metric about their remote support systems. It is not just a regulatory gap. It is a trust gap.
The promise of autonomous vehicles has always rested on the idea that software can make transportation safer, more efficient, and more accessible than human-driven alternatives. That promise is worth pursuing. But it cannot be built on opacity. If these companies genuinely believe their technology is safe and their operations are sound, the most powerful thing they could do is prove it with data.
The public is already sharing the road with these vehicles. The least the industry can offer in return is honesty about what is happening inside them.
With more commercial deployments planned across the country, and federal scrutiny now firmly engaged, the era of voluntary disclosure has likely run its course. The next chapter will be defined not by what these companies choose to reveal — but by what they are required to.