You can use older images, collected from before the "poisoning" software was released. Then you don't have to deal with the problem at all.
This, of course, assumes that "poisoning" actually works. Glaze, Nightshade, and similar tools are very much akin to the various documented attacks on facial recognition systems. The attack does not exploit some fundamental flaw in how the systems work, but rather specific characteristics of a given implementation and version.
This matters because later versions and models will not share the same vulnerabilities. The result is that any given defensive transformation should be expected to be only narrowly effective, against the specific generation of models it was crafted for.
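The intuition can be shown with a toy sketch (this is not Glaze's or Nightshade's actual algorithm, just a hypothetical illustration): an adversarial perturbation crafted against one linear "model" shifts that model's output a lot, but barely affects a differently trained model of the same size, because the perturbation direction was tuned to the first model's specific weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy linear "models" for the same task, trained independently.
# Standing in for "version N" and a later, differently trained "version N+1".
w_a = rng.normal(size=64)
w_b = rng.normal(size=64)

x = rng.normal(size=64)  # a clean "image" (just a feature vector here)

# Craft a perturbation against model A only, FGSM-style:
# step against the sign of A's gradient (for a linear model, its weights).
eps = 0.3
delta = -eps * np.sign(w_a)

# How much does the perturbation move each model's output?
shift_a = abs(w_a @ (x + delta) - w_a @ x)  # = eps * sum(|w_a|), maximal by design
shift_b = abs(w_b @ (x + delta) - w_b @ x)  # small: delta is uncorrelated with w_b

print(f"shift on model A: {shift_a:.2f}")
print(f"shift on model B: {shift_b:.2f}")
```

Real image-generation models are far from linear, and attacks like these do transfer partially between related architectures; but the basic dynamic is the same: the perturbation budget is spent on directions that matter to one specific model, and a sufficiently different model mostly ignores them.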