To quote the source:
> Nightshade's goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.
That feels similar to DRM: the aim is not to make extraction impossible, but to discourage it by raising its cost.