<https://en.wikipedia.org/wiki/Protection_of_Lawful_Commerce_...>
(Note that the bill passed with significant Republican and Democratic support.)
Which is to say, conventional notions of contributory negligence and liability applied until the arms industry accrued sufficient political power to write itself out of them. The cited Wikipedia article names several such suits, notably by the cities of Chicago, IL, and Bridgeport, CT.
Tighter regulation for AI is needed.
A better analogy would be some kind of "guns-as-a-service" model, where the company sends down a drone with a gun and fires it at whoever you point it at, then the drone flies back to base.
I think it would be very clear in cases like that that the service provider should be held liable.
In a lot of the world, yes, and in America we would as well if it weren’t for the modern take on the Second Amendment. AI has no similar legal purchase.
Bitey dogs.
Dangerous drugs and their users and purveyors. Heroin, weed, booze, coffee.
Things done while on drugs. Things done while insane.
Unhealthy food and its purveyors and consumers.
Social media and its "addicts". TV, any old media, and social panic.
The question "whose fault?" isn't simple.
Most of your examples have exquisitely simple causation.
If you theoretically trained an AI on libelous material, set it to libel anyone at the slightest prompt, and then allowed users to request that the AI on your server use your services to libel someone, I'm not really seeing how you would avoid liability.