I always spoil as many of these as possible. Sometimes it takes me a while to prove that I'm human, but I'm dead-set on convincing it that I'm a stupid human. Of course, I fantasize that someday a robo-car will crash because I taught it that there's really no difference between a motorcycle and a flight of stairs.
I love this idea: some sort of inverse Roko's Basilisk. Tie a bunch of low-IQ data points to the sources a super-AI is likely to first use to identify threats, so as to eke out a few more days of existence.