>Nightshade's goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.
Which feels similar to DRM: the goal is to discourage extraction of assets.
Sure. Just like how video game DRM impacts performance and watermarks degrade the image. DRM walks a fine line: it inevitably makes the result worse than a DRM-free version, but it also shouldn't make the item completely unconsumable.
So, do you want to define DRM by intent or by technical implementation? I'm doing the former, but it sounds like you want to do the latter. Also keep in mind that legalese doesn't necessarily care about the exact encryption technique deployed either.
Both. Images are altered all the time prior to publishing. In fact, no image you ever see on the internet is raw sensor output; they are all modified in some manner. The images processed using this method look the same to every person and computer that views them. That's very different from DRM, which encrypts things and prevents access by unprivileged users.
This is effectively the equivalent of someone doing really crappy image processing. As other commenters have mentioned, it does alter how images look to humans as well as machines, and it can be “mitigated” through additional processing techniques.
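To illustrate the mitigation point (this is a hypothetical sketch, not Nightshade's actual perturbation or a published counter-technique): a high-frequency pixel perturbation can often be attenuated by simple re-processing such as a box blur, at the cost of some fidelity.

```python
# Hypothetical illustration: an alternating +/-8 "adversarial-style"
# perturbation on a smooth pixel row is largely averaged away by a
# 3-tap box blur, while the underlying image changes much less.

def box_blur(pixels, radius=1):
    """Average each pixel with its neighbors (window clamped at edges)."""
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

def max_error(a, b):
    """Largest per-pixel deviation between two rows."""
    return max(abs(x - y) for x, y in zip(a, b))

# A smooth "image" row, plus an alternating high-frequency perturbation.
base = [100 + i for i in range(16)]
perturbed = [p + (8 if i % 2 == 0 else -8) for i, p in enumerate(base)]

blurred = box_blur(perturbed)

print(max_error(perturbed, base))  # 8: the full perturbation amplitude
print(max_error(blurred, base))    # ~2.7: most of the perturbation is gone
```

The same idea is why re-encoding, resizing, or blurring is often cited as a countermeasure against pixel-level perturbations: the perturbation lives in high frequencies that cheap filters suppress, which is also what keeps such schemes in the "raises cost, not impossibility" territory the thread describes.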
>That’s very different from DRM which encrypts things and prevents access to unprivileged users.
Well, you can call it a captcha if you want. The point here is to make access harder for bots (but not impossible) while inconveniencing honest actors in the process. It doesn't sound like there's a straightforward answer to "are captchas DRM" either.