A Tesla Crash: Why Autopilot does not mean autonomous and how your Roomba vacuum offers a similar level of ‘crash protection’
I’m writing this post as a follow-up to my 2015 piece, Don’t Forget How to Drive, about autonomous driving. I recently had the unfortunate opportunity to analyze a Tesla Model S Autopilot crash up close. No one was injured, but it could have turned out much worse. My dad recently purchased a Tesla, thanks to my constant prodding. He drove hybrids before and is a die-hard believer in electric cars, but with the lack of exciting electrics on the market, he waited to jump in. He loves his Tesla, but this crash has made him reflect on the state of autonomous driving capabilities, and it has left him disappointed. The video Tesla posted last month to its social media channels is only pseudo-accurate: as with any new technology there are caveats, and when it comes to the Autopilot feature, more than a few are not fully fleshed out yet. His disappointment comes down to expectations: a product that didn’t live up to its promised features. I’m not disputing that the Model S is a state-of-the-art electric car. It is. But its autonomous features have been oversold.
Misaligned expectations between Tesla and its customer base have been a hot-button issue since the company’s inception. In the case of the Tesla Roadster, people put down $100k expecting delivery in early 2006; the first customers received their vehicles in mid-to-late 2008. Some were not so happy about this.*
The Accident
The Tesla, in the middle lane with Autopilot on, signaled a lane change with the left blinker to move into the left lane. As the vehicle began to move over, it was hit by an approaching car on the front left quarter panel, clipped the mirror, and skimmed the front bumper before the driver (my father) veered back into the middle lane. Autopilot’s field of vision in front of the vehicle is close to 50 meters, but it is less than 5 meters behind the car. That means managing tricky scenarios in front of you is a piece of cake, but a car darting between lanes behind you is not so easy to remedy. Given the rate of acceleration of the approaching vehicle and the limited view behind the Tesla, engaging Autopilot creates a dicey situation, even though Tesla’s video states the car “changes lanes with a single tap.” On the surface, that’s true; however, the algorithmic decision the vehicle makes rests on the reach of its sensors. Additionally, the side mirror doesn’t have the expanded wide-angle view that many cars offer today. Surprisingly, the other vehicle (also a luxury car) was heavily damaged on both of its passenger-side doors, while the Tesla has only a few ‘dings and scuffs’, a point in favor of the Tesla’s rigidity.
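As a rough sanity check on those numbers (the ~50-meter and ~5-meter figures are this post’s estimates, not official Tesla specs), here is a back-of-the-envelope sketch of how much warning time each detection range buys against a car closing at a given relative speed:

```python
# Illustrative only: warning time = detection range / closing speed,
# assuming a constant closing speed (a simplification).

def warning_time(detection_range_m: float, closing_speed_kmh: float) -> float:
    """Seconds between first detection and the vehicles meeting."""
    closing_speed_ms = closing_speed_kmh / 3.6  # km/h -> m/s
    return detection_range_m / closing_speed_ms

# A car closing at 40 km/h relative to the Tesla:
front = warning_time(50, 40)  # ~4.5 s with the ~50 m forward range
rear = warning_time(5, 40)    # ~0.45 s with the ~5 m rear range
print(f"front: {front:.2f} s, rear: {rear:.2f} s")
```

Under those assumptions, the forward sensors give several seconds to react, while the rear range leaves well under a second, which is why a fast-approaching car from behind is the hard case.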
The natural response from Tesla (an $8 billion company) to this situation is likely to be, “well, we’re still a startup and we’re working out the kinks.” That mentality is plausible when a new app on your iPhone keeps crashing; it’s not going to kill you. However, when a driving-assistance program that some critics contend has over-promised features is engaged on the highway at high speeds before it has been fully refined, you’re dealing with life-and-death situations.
Surprisingly, even the Nest thermostat has a larger field of vision and range with its proximity sensors:
Nest uses motion, proximity, and heat sensors. It has two main motion sensors: near field and far field.
The near-field sensor has a range of 6 feet, and the far-field sensor has a range of 25 feet. The motion sensors detect within a 170-degree angle off the front of the sensor window. The range will vary depending on the light level and where the Nest is installed.
This essentially means that Nest’s sensor technology, although not ultrasound, has a greater range than the rear sensors of a Tesla Model S. That’s alarming.
Although I can’t confirm specific figures, the ultrasonic and IR (infrared) sensors on robotic vacuums generally have a proximity range of 2.5 to 3 meters, and in the case of iRobot’s Roomba Lighthouse, possibly more. The point is, the Tesla’s rear is about as protected from a crash as your Roomba is when it’s on a collision course with your couch, and not nearly as entertaining as watching your cat get chauffeured around the living room by its own personal autonomous feline Uber.
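Putting all of the ranges mentioned in this post side by side makes the comparison concrete. These are the article’s approximations, not official specs, normalized to meters:

```python
# Unit-normalized comparison of the detection ranges cited in this post.
# All figures are the article's approximations, not manufacturer specs.

FT_TO_M = 0.3048  # feet -> meters

ranges_m = {
    "Tesla Model S (front)": 50.0,
    "Tesla Model S (rear)": 5.0,
    "Nest far-field sensor": 25 * FT_TO_M,  # ~7.6 m
    "Robot vacuum (typical)": 3.0,
    "Nest near-field sensor": 6 * FT_TO_M,  # ~1.8 m
}

# Print longest range first.
for name, r in sorted(ranges_m.items(), key=lambda kv: -kv[1]):
    print(f"{name:24s} {r:5.1f} m")
```

Sorted this way, the Nest’s far-field sensor (~7.6 m) outranks the Tesla’s rear coverage (~5 m), which is the comparison being made above.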
There is a case to be made that, since the environment around your car is constantly changing due to road conditions, speed, and a handful of other factors, some capabilities, such as degrees of vision, will inevitably be compromised. I agree with this; however, the way Tesla’s Model S features have been billed seems in conflict with the actual abilities of the vehicle, as a number of Tesla owners have been discussing online.
Why the NHTSA (National Highway Traffic Safety Administration) doesn’t go after Tesla the way the EPA went after Volkswagen is obvious: the little guy gets away with flaws, especially when bringing something so new to market that most government agencies don’t yet know how to regulate it. Just this week, the NHTSA announced that it considers the SDS (self-driving system) of Google’s self-driving car to be a driver. That is a major step forward for autonomous driving; however, Wired also cited numerous problems:
“The second new problem is that the feds don’t have the tools they need to test new systems. The rule concerning rear visibility, for example, requires that the vehicle “display a rearview image (of a specified area of certain dimensions behind the vehicle) to the vehicle operator.” NHSTA’s happy to say that the operator is in fact a pile of software, and there’s no talk of eyeballs here.”
Until vehicles, regardless of operating system, are speaking to one another, these mishaps will likely increase. And although it was technically the ‘car’s fault’ in the Tesla accident, the car is the driver’s property, which means it remains his or her responsibility; therefore, ‘the human is at fault.’ Or will manufacturers step in and front some of the bill to insurance companies when this happens?
The real question isn’t, are we ready for autonomous cars, but are we ready for autonomous accidents?
Additional sources:
*Elon Musk, by Ashlee Vance
Comment (Principal Software Engineer - Ceph at IBM):
I think this article is a bit alarmist and dismisses the driver’s responsibilities. The fact is, based on your description, the accident would also have occurred if the driver were in control. He simply did not account for the approaching driver. Tesla is very up front that the driver needs to ensure it’s safe to make the lane change. While it’s true that the Tesla software will prevent some unsafe lane changes, it’s hardly fair to blame Tesla Motors for this incident or to insinuate that the car should have prevented the collision (with its 16' range on the ultrasonic sensors). If the approaching car was weaving in and out of the traffic lanes at high speed (relative to the Tesla) as you describe, it’s unlikely a longer-range detector would have helped anyway. There will always be limits to new technology; it’s up to us mortals to understand and work with the limits, not curse them after we’ve failed. Aside: the Nest likely uses passive infrared for its ‘long range’ sensing, which is fine for a thermostat that only needs to know you’re there. That tech doesn’t work so well for a car that needs to understand the distance and approaching speed of another vehicle.
Good thing we weren't in the car ;)
Comment (Marketing Strategist; former TriMark, Oracle, Kia, Nissan, Infiniti, Ford, PepsiCo):
Great POV, thanks for sharing. And yes, I have experienced Tesla’s Autopilot feature.