https://commons.wikimedia.org/wiki/File:New_Holland_TC30_HST...
and even it has a sensor in the driver's seat that will stop it if there is less than 100 lbs in the driver's seat. (When my son weighed 80 lbs he couldn't drive it unless I put two 10 lb weightlifting plates in a backpack for him to wear.)
The purpose there is so that the tractor doesn't drag or crush you if you fall out of the seat, but that kind of sensor is also commonly used in the passenger seat of cars to scale the force of the passenger side airbag to the size of the occupant.
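The interlock described above boils down to a simple threshold check on the seat's load sensor. Here's a minimal sketch, assuming hypothetical `read_seat_load_lbs()` and `cut_drivetrain()` hooks (the real hardware does this in a safety-rated circuit, not application code):

```python
SEAT_LOAD_THRESHOLD_LBS = 100  # minimum load treated as an occupied seat


def occupant_present(load_lbs: float) -> bool:
    """Return True if the measured seat load indicates a seated operator."""
    return load_lbs >= SEAT_LOAD_THRESHOLD_LBS


def interlock_decision(load_lbs: float) -> str:
    """Decide whether the drivetrain may stay engaged for this reading."""
    return "run" if occupant_present(load_lbs) else "cut_power"
```

With these numbers, an 80 lb reading (the son alone) yields `"cut_power"`, while 100 lb (son plus the two plates) yields `"run"`, which matches the anecdote. The same threshold idea, with multiple calibrated bands instead of one cutoff, is what lets a car's passenger seat scale airbag force to occupant size.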
What could Tesla possibly do to provide more safeguards? Measure resistance on the steering wheel to prove it is flesh on there? Driver-facing camera? Weight sensor in the seat? Each of these has its own issues, including defeatability and privacy invasions.
Anything can be defeated eventually. However, any attempt to defeat a safeguard should automatically shift liability onto the person making that attempt. Example: it is not Samsung's fault if I intentionally disable safeguards and overcharge my battery, causing a fire or explosion.
Because other companies do it better. Tesla likes to publicly position its driver assistance as being so good that they refer to it as "Full Self Driving", and yet the system doesn't include a number of basic safety measures. That's part of the reason why they tell regulators that "Full Self Driving" is only a level 2 system:
https://www.thedrive.com/tech/39647/tesla-admits-current-ful...
> Measure resistance on the steering wheel to prove it is flesh on there? Driver-facing camera? Weight sensor in the seat?
Yep.
> Each of these has their own issues including defeatability and privacy invasions.
Tesla cars in general are privacy invasions. For example, a recent car crash resulted in Musk tweeting about what Tesla's logs supposedly reveal about the car before it crashed. Tesla is quite happy to rummage through your data and use it to protect themselves. In this case it earned Tesla a search warrant:
https://www.thedrive.com/tech/40250/elon-musk-denies-autopil...
No, Tesla very clearly says that full self driving is a future feature and not one that is available today (https://www.tesla.com/autopilot). All it says on there is that the hardware is present to enable it. There’s also a disclaimer on there:
> Current Autopilot features require active driver supervision and do not make the vehicle autonomous.
They spell this out even more clearly elsewhere in their documentation. The FSD Beta release notes explicitly tell users that it can do the wrong thing at the worst time, that they need to pay extra attention, and that they shouldn’t get complacent (https://www.newsweek.com/tesla-full-self-driving-beta-releas...).
I do agree Tesla’s practices seem like they’re normalizing massive privacy invasions. But I really don’t think the company is at fault if irresponsible drivers use their vehicle in an incorrect manner, which they’ve been abundantly warned against.
I see no reason to believe Musk, a serial liar with a massive incentive to lie here.
He even hedged by saying "data recovered so far" doesn't show Autopilot engaged.
Either way, one thing is rather certain: stupid humans were involved and are most likely the sole reason for the accident, regardless of which story turns out to be true.
The Consumer Reports people went out of their way to bypass Tesla's existing safety features, by placing a fake weight on the wheel (a heavy chain) and by buckling the seat belt without anyone sitting there. So what? It's a contrived experiment.
Lots of people on social media seem to hate Elon Musk and are using this as some sort of indictment of him or Tesla. I don't see anything particularly interesting or outrageous about this. People can misuse cars already, and they will come up with clever workarounds in the future as well. It's not exclusive to Tesla. It's not even exclusive to cars. You could misuse a kitchen knife as a toothbrush, but that doesn't mean Cutco needs to be regulated.
The articles won't be flagged as such because they are neither misinformation nor fake news.
That is misinformation. Facebook and Twitter have flagged articles for much less, because they felt the headlines were misleading or incomplete.