Elon Musk actively promotes his Tesla cars’ upgraded level-2 ADAS suite, available for an extra US $15,000, as “Full Self-Driving”, a moniker the world’s regulators and watchdogs are increasingly calling wholly inaccurate. Musk also scorns lidar as “a fool’s errand” and mocks its proponents—the entire rest of the ADAS · AD · AV world—as “doomed” and “losers”. Yet lidarless Teslas are driving themselves and their occupants into harm’s way in kinds and severities of crashes not seen in more realistically-equipped and -configured cars. What’s going on?
Clearly, Tesla haven’t made anything like a fully self-driving car. Just as clearly, Musk is eager to offer (or at least be perceived as offering) a ‘full self-driving’ car. He’s been promising full self-driving “next year” for at least nine years. He’s also keen to achieve this (or be perceived as achieving it) at the lowest possible price; that would explain his singular scorn of lidar, as well as his latest decision to delete the ultrasonic sensors from new Teslas. He’s admitted that (real) full self-driving is “really the difference between Tesla being worth a lot of money or worth basically zero”.
That means he’s betting Tesla’s fortunes on using questionable software blinkered by inadequate, outdated hardware to solve the huge challenges of autonomous driving. A prime example is the cameras mounted in the B-pillars. Located behind the driver, these are what Tesla’s ‘Full Self-Driving’ software uses to see traffic crossing ahead, so it can make judgement calls during one of the riskiest driving manœuvres: crossing or merging into fast-moving traffic. This is challenging for an experienced human driver, but it is abjectly hazardous for Tesla’s ‘Full Self-Driving’ software, saddled as it is with under-equipped, underspecified hardware.
Imagine this common situation: you come to an intersection with a stop sign and need to get across to continue straight, or you need to turn across traffic into the far lane. The cross traffic has no traffic control, so you’ll need to cross and/or join a fast-moving stream of vehicles from a stop. To do this safely requires a vantage point that allows you to see a good distance in both directions, so you can spot and judge the adequacy of a gap that will allow you to cross and/or join the traffic stream.
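To get a feel for how much sight distance this judgement actually demands, here is a rough back-of-envelope sketch. All figures are illustrative assumptions (crossing distance, acceleration, safety margin), not measured values or Tesla specifications:

```python
import math

# Illustrative assumptions, not measured values:
CROSS_DISTANCE_M = 12.0   # lanes to clear plus own car length
ACCEL_MS2 = 2.5           # modest acceleration from a standing start
MARGIN_S = 1.5            # buffer before the next vehicle arrives

def required_sight_distance(traffic_speed_kmh: float) -> float:
    """Distance (m) a viewpoint must cover in each direction to spot
    an adequate gap in cross traffic, starting from a stop."""
    # Time to cross from standstill: s = ½at², solved for t.
    t_cross = math.sqrt(2 * CROSS_DISTANCE_M / ACCEL_MS2)
    gap_needed_s = t_cross + MARGIN_S
    v_ms = traffic_speed_kmh / 3.6  # km/h → m/s
    return v_ms * gap_needed_s

print(round(required_sight_distance(60)))   # ~77 m at 60 km/h
print(round(required_sight_distance(100)))  # ~128 m at 100 km/h
```

Even with these modest assumptions, the viewpoint needs a clear line of sight on the order of 75–130 m in each direction. Any geometry that shortens that line of sight eats directly into the safety margin.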
Often in these situations an obstruction—garbage cans; a bus stop; parked cars, or something else—means you must move forward for a better view, and crane your neck to look left and right. A human driver can easily adjust themselves this way, and so can make an accurate decision on when it is safe to go.
Tesla’s B-pillar cameras are mounted between the front and rear doors on each side, about 20 cm further to the rear of the car than the driver’s eyes. Musk bragged about the positioning of these cameras at Tesla’s Autonomy Investor Day in April 2019: “The cameras in the car have a better vantage point than a person”. This is clearly not the case—a human driver can lean forward to reposition their eyes so as to see what must be seen, but Tesla’s ‘Full Self-Driving’ software can’t; it’s stuck with the physically-hindered viewpoint of the cameras. So it is physically unable to dependably see oncoming traffic, let alone accurately judge its distance or speed. Cars with ‘Full Self-Driving’, therefore, cannot safely navigate this type of intersection. They’re terrible at it, because the software lacks the kind and amount of input needed to determine whether it is safe to go. And that’s under the best of conditions; the B-pillar cameras have even bigger problems with certain sun angles. It is as if a human driver were tightly duct-taped to their seatback, with their head locked straight forward and the car’s left and right windows painted black, and the car’s sunvisors removed.
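A simple similar-triangles model shows why even 20 cm matters at an obstructed intersection. The scene below is hypothetical (corner and lane positions are assumptions chosen for illustration), but the geometry is general: the further back the viewpoint sits relative to an obstruction’s corner, the shorter the stretch of cross street it can see.

```python
def visible_along_cross_street(corner_fwd_m: float,
                               corner_lat_m: float,
                               lane_fwd_m: float) -> float:
    """Lateral distance (m) visible along the cross lane before an
    obstruction's corner cuts off the sight line.

    The sight line from the viewpoint grazes the obstruction's corner at
    (corner_fwd_m ahead, corner_lat_m to the side) and is extended to the
    cross lane lane_fwd_m ahead: visible = corner_lat * lane_fwd / corner_fwd.
    """
    return corner_lat_m * lane_fwd_m / corner_fwd_m

# Assumed scene: obstruction corner 2 m to the side, cross lane 6 m ahead.
print(visible_along_cross_street(1.0, 2.0, 6.0))  # driver's eyes: 12.0 m
print(visible_along_cross_street(1.2, 2.0, 6.0))  # camera 20 cm back: 10.0 m
print(visible_along_cross_street(0.5, 2.0, 6.0))  # driver leaning 50 cm fwd: 24.0 m
```

In this sketch, the camera’s 20 cm handicap trims the visible stretch of cross street, while a driver leaning half a metre forward roughly doubles it—an option the fixed B-pillar camera simply does not have.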
Musk has promised that any Tesla bought after 2016 would be capable of full self-driving without hardware upgrades, but a bit of scrutiny shows this cannot possibly be true. There is very little that could realistically be done to fix the clear safety risk posed by this poorly-placed, under-specified hardware. Tesla would need to either move the existing cameras much further forward or install completely new ones; that would be a huge, costly operation requiring holes drilled in the bodywork and reworked electronics, and it is likely not feasible.
This is a fundamental and awful error from Tesla, one of many I’ve catalogued and analysed. The B-pillar camera problem should have been obvious to Tesla’s engineers from an early stage. Perhaps it was, but the concern was overruled or went unvoiced.
Whatever the reasons why the B-pillar cameras are what and where they are, they’re just not adequate to the task. They exemplify numerous other shortcomings; flaws, and failures in Tesla’s ‘Full Self-Driving’ product—hardware and software alike. Everybody whose opinion is relevant agrees that redundant hardware and robustly error-proof software are inescapably necessary for any amount of real, safe self-driving. Musk is the only dissenter; he insists he’s right and the whole rest of the world is wrong, but there’s a growing pile of crash debris—car parts and people-parts alike—demonstrating he’s wrong about that, too.
Dan O’Dowd is the Founder of The Dawn Project, a public safety advocacy group campaigning to make computers safe for humanity. He is a world-renowned expert in creating software that never fails and can’t be hacked, and created secure operating systems for projects including Boeing’s 787s; Lockheed Martin’s F-35 fighter jet; the Boeing B-1B intercontinental nuclear bomber, and NASA’s Orion crew exploration vehicle.
O’Dowd is a worldwide authority in embedded safety and security, creating safe and secure real-time operating systems to support a broad range of hardware and software platforms in multiple industries including avionics; self-driving cars, and remotely-controlled medical equipment.
The security of his operating system has met the highest standards of the Federal Aviation Administration, as well as those of the National Security Agency and the National Institute of Standards and Technology—the latter a certification no other company in the world has yet achieved.
DVN recently reported on O’Dowd’s Super Bowl advert showing the poor safety performance of Tesla’s ADAS suite.