CR's video proving that Autopilot will work with no one in the seat.
-
Not that we really needed the proof, but I'm glad a trusted source like CR ran this experiment to turn up the heat on Musk & Tesla to make their systems safer (Darwin-award-finalist-safe).
Like a whitehat hacker publicizing a vulnerability after giving due warning, it forces the company to take action to fix it.
-
@davesaddiction People have been beating it with bottles of water jammed into the steering wheel spokes.
-
Yeah, or oranges (debunked?).
Like I said, it's obvious to anyone like us paying attention, but having CR do it raises the attention level to more "normal" people.
-
@davesaddiction Maybe this latest issue will make the government force them to change the name to something that makes sense instead of something that sounds good.
-
@davesaddiction Yet another great use for duct tape!
-
As stated in other threads, "SuperCruise", or something like it, is the appropriate term.
Elon should also be banned from ever using the term "Full Self Driving", and it should be deleted from all marketing materials.
How long have they been pre-selling this capability as a "future option" on their cars? Have any of these buyers been refunded their money?
"
Three years ago[in 2015], Musk claimed that Tesla’s vehicles would be ready and able to completely drive themselves without any human interaction by 2017.Two years ago[in 2016], Musk announced every car made going forward would have the hardware necessary to facilitate this goal. Tesla has spent the years since advertising this impending breakthrough on its website as an easy add-on to the purchase of a new car, something that only required a few thousand dollars and a little bit of patience. Those promises have all since weakened, though. Muskrecently[in 2018] admitted that the company will need to upgrade cars already on the road with new hardware — specifically, a new AI chip — in order to endow them with full self-driving capabilities. (Even then, some in the industry believe Tesla’s cars lack a crucial piece of the autonomous puzzle.) Tesla missed Musk’s 2017 estimate for rollout of Full Self-Driving by at least a year [LOL]. And now, Tesla has dimmed the visibility of Full Self-Driving in general, raising questions about the company’s approach to one of its grandest goals."Overpromise and underdeliver! It's been working for them so far...
-
@davesaddiction And you can put a brick on the accelerator of a regular car...
Tesla is a mess in a lot of ways, but if anyone does this, I see it as being on the person who bypassed Tesla's safeguard rather than on Tesla. This is not something that anyone could do accidentally.
-
@davesaddiction
GM's Super Cruise really brings the simplicity of Tesla's checking system home. Camera on the driver making sure they're looking at the road. Easy peasy. Of course it only works on highways, etc., but the point is to key on the driver and make sure they're engaged (or at least acting like it), which the GM system does. Much harder to trick...
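To make the contrast concrete, here's a rough sketch of the two kinds of checks; all the names, thresholds, and timings are my own illustrative guesses, not anything from GM's or Tesla's actual software:

```python
import time

# Hypothetical numbers for illustration only.
TORQUE_THRESHOLD_NM = 0.5   # tiny steering-wheel torque counts as "hands on"
GAZE_TIMEOUT_S = 4.0        # how long the driver may look away before a warning


def torque_style_check(steering_torque_nm: float, seat_occupied: bool) -> bool:
    """Hands-on-wheel check (Tesla-style): any torque above a small threshold
    passes, so a water bottle wedged in the spokes passes too."""
    return seat_occupied and abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM


def camera_style_check(face_detected: bool, last_eyes_on_road: float) -> bool:
    """Driver-monitoring check (Super Cruise-style): a camera has to see an
    actual face with eyes recently on the road, which is much harder to fake."""
    return face_detected and (time.time() - last_eyes_on_road) <= GAZE_TIMEOUT_S
```
-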
@facw True, but the fact is that people are idiotic enough to do it (as that crash proves), and while other manufacturers (GM, Ford, etc.) have FAR more robust safeguards to prevent it, Tesla has never bothered.
-
@davesaddiction I've always thought the long pole in the tent toward achieving autonomous cars would be the legal nightmare of companies being responsible for all accidents if there is no driver. Been hilarious to me that Tesla just skipped that part and did it anyway. Impossible for Tesla to avoid being in legal hot water over this shit considering they market it as "full self driving" and "autopilot". The fuck is that supposed to mean if I can't sit in the back seat and nap while the car drives? Sure, don't trust the computer overlords just yet. But marketing would have the average lemming believe in their dear lord and savior Elon Musk.
-
@facw While this is true, it's a bit asinine to offer a service called "Full Self Driving" and then be surprised when people actually take you at face value.
-
@rctothefuture They don't offer such a service yet. And even if you think they do, the fact that you have to sabotage the car to get it to work should tell you that your understanding is poor.
-
@rallydarkstrike It's still not clear what caused this crash. Last I saw, Elon (admittedly not the most reliable narrator) was still saying this car didn't even have Autopilot (he also says that even if it did, it wouldn't be able to engage on a street like this, though that claim seems more suspect).
-
@facw Seems Tesla likes to call it that, though. Also, it's not really sabotaging. It shouldn't work if no one is in the driver's seat; that's a flagrant safety miss. My Mini will chirp about a seat belt if I put more than a bottle of water on the seat, and yet Elon can't put the same kind of sensor in his seats?
GM did it right: it works off your body being in position with your eyes open. The fact that someone could fall asleep while still holding the wheel of a Tesla is a scary enough proposition, let alone being in a different seat altogether.
-
@facw said in CR's video proving that Autopilot will work with no one in the seat.:
And you can put a brick on the accelerator of a regular car...
Yes, of course. But it could be argued that Tesla is being negligent for making these systems as easily defeatable as they are. It's one thing if a Tesla driver kills themselves by being stupid and reckless; it's another thing entirely if a Tesla driver kills another family by being literally asleep at the wheel on the interstate, because their car makes it so easy (and the company's marketing would have them believe there's little actual danger in doing so, regardless of what the owner's manual says).
-
@rctothefuture Your Mini will chirp at you, as will the Tesla. But neither will stop the car from operating.
-
@davesaddiction Tesla does detect if you are asleep at the wheel. If you sabotage that detection system so you can get your shuteye while driving, then that dead family is definitely on you and not on Tesla.
-
@facw The difference is that I'm in the driver's seat with a steering wheel and pedals, and I have control. The Tesla sensor should stop you from driving; my sensor should just annoy me because I put a bookbag on the passenger seat.
The fact that these people can drive without that is dangerous and outright stupid. If I ever see someone doing this shit, I'm calling 911 and getting their ass pulled over. No one should be in an accident that was 100% preventable just because we believe Teslas are Full Self Driving vehicles.
-
I agree that the driver who defeats the safety system is to blame.
But Tesla is not completely without blame in some of these accidents/deaths. Their safety systems pale in comparison to other manufacturers, probably partially because of their desire to be seen as the leader in near-autonomous driving. Their recklessness on this front has led to tragedy.
-
@rctothefuture You can set the cruise control and hop out of the driver's seat of your Mini just as easily as a Tesla driver can. The Tesla is actually better, in the sense that if you hop out of the Tesla's seat it will quickly detect you aren't there (unless you sabotage that detection). That is to say, Tesla's sensor does stop you from driving. To say they should have to add a fancy computer vision system to monitor you because a bad actor could intentionally undermine their existing system seems bizarre to me.
And yes, anyone who sabotages the system should get 911 called on them, and probably go to jail. But it's really not Tesla's fault that someone decided to undermine critical safety systems.
-
@facw So Tesla should just sell a system that allows people to undermine its safeguards with some tape and a bungee cord? You're shitting me, right?
-
@rctothefuture Again, you don't even need tape and bungee cords to undermine cruise control. You are surrounded by products with easily bypassed safety mechanisms, or ones with no protection at all. It seems totally weird that just for this one class of vehicle you demand an unbeatable system.
-
@facw Yeah, though wasn't he also trying to block them from getting the data from the car?
-
@rallydarkstrike Haven't read that. Wouldn't shock me; I tend to think Tesla is a pretty shitty organization. But even for Musk, it's more than a little bizarre to make such a claim if he can't back it up. It certainly seems premature to say this was definitely an Autopilot problem at this point. I guess we'll see what the investigation turns up.
-
The difference is the perceived risk to the driver himself.
In a normal car, the perceived risk of setting the cruise and getting out of the driver's seat is massive.
In a Tesla, the perceived risk is minuscule (because of all the reasons stated earlier).