If the job description listed “must remove all sense of legal duty, moral ethics, and humanity” as responsibilities, would you apply? According to a new report, that’s essentially the role of a test driver for Tesla.
Self-driving vehicles are the future. If tech companies have their way, autonomous cars will be the only future. Many of us live in reality, though, and understand that the software is nowhere near where it needs to be for that sci-fi level of automation. But the folks with the funds will keep pushing to speed up the process, even if it means beta testing on public roads with all of us as the guinea pigs.
Business Insider has revealed details of a specialized team of Tesla test drivers dubbed Project Rodeo. This testing team is trained to push the limits of the automaker’s Full Self-Driving (FSD) and Autopilot software. What exactly is the limit? Crashing. But the rule of thumb seems to be to get as close as possible to colliding with something or someone. The scarier the situation, the better.
“You’re pretty much running on adrenaline the entire eight-hour shift,” said one former test driver. “There’s this feeling that you’re on the edge of something going seriously wrong.”
Business Insider interviewed nine current and former Project Rodeo drivers and three Autopilot engineers from California, Florida, and Texas; most asked to remain anonymous. The situations they describe are eye-opening but not surprising. Although FSD-related crashes have been well documented, none of those interviewed were involved in one.
Project Rodeo is a test group made up of smaller teams. For example, the “golden manual” team drives by the book, follows the rules of the road, and doesn’t use any driver-assistance features. At the opposite end of the spectrum is the “critical intervention” team. More riders than drivers, critical intervention testers let the software handle all aspects of the drive and step in, or “intervene,” only to prevent a collision.
Part of the reason test drivers wait until the 11th hour to take manual control is that it gives the software time to react and make its own decision, right or wrong. The more data collected, particularly in real-world scenarios, the easier it is for engineers to adjust and update the software.
“We want the data to know what led the car to that decision,” said a former Autopilot engineer. “If you keep intervening too early, we don’t really get to the exact moment where we’re like, OK, we understand what happened.”
What this leads to, however, is vehicles being allowed to run red lights, cross double yellow lines, blow through stop signs, and exceed speed limits, all on public roads. Even when a situation became uncomfortable for the driver, supervisors would say they took over too soon. As a result, Project Rodeo drivers, even those outside the critical intervention team, felt pressured to maintain risky driving situations, or sometimes create them outright, to test the software and keep their jobs.
John Bernal, a former test driver and data analyst, said he was never told to deliberately break the law for data collection, but it was strongly implied. “My training was to wait until the wheels touched the white line before I could slam on the brakes,” he said.
On top of that, certain drives were used solely to train the software to recognize and adjust to “vulnerable road users,” such as pedestrians, cyclists, and wheelchair users. One former tester said that, while driving with his trainer, his vehicle came within three feet of a cyclist before he hit the brakes.
“I vividly remember this guy jumping off his bike,” he said. “He was terrified. The car lunged at him, and all I could do was stomp on the brakes.” Apparently, his trainer was pleased, telling him that his late reaction was “perfect” and exactly what they wanted him to do. “It felt like the goal was almost to simulate a hit-or-miss accident and then prevent it at the last second.”
Cruise and Waymo are also developing self-driving cars, but they say they conduct rigorous software testing in controlled environments or consider their autonomous systems “fundamentally different” from Tesla’s. Hmm, then why do these companies run into the same issues with vehicles failing to read the room, so to speak? In the case of Uber’s now-shuttered self-driving division, the results were sometimes deadly.
“If you have a parent that’s holding the bike the entire time, it never gets to learn,” said a former Tesla engineer. Ultimately, data is king. For these autonomous tech companies now at the mercy of shareholders, it’s a high-risk, high-reward environment that the public didn’t sign up for.
Got tips? Send ’em to [email protected]