> it does raise serious questions as to whether the models used for autonomous cars could also "hallucinate" and do something stupid "on purpose"...
It doesn't, because Tesla's FSD model is just a rules engine with an RGB camera. There's no "purpose" behind any hallucination; it would just be a misread of sensors and inputs.
Tesla's FSD just doesn't work. The model is not sentient. It's not even a Transformer (in both the machine-learning and Hasbro senses).