It’s designed to restrict the purposes for which the consumer can use the work. It is exactly like DRM in this way.


To be clear, you can still train AI with these images. Nothing is stopping you.


To quote the source:

>Nightshade's goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.

Which feels similar to DRM: it's meant to discourage extraction of assets.


It also degrades the quality of the image for human consumers. It’s just a matter of what someone is willing to publish to “the public.”


Sure. Just like how video game DRM impacts performance and watermarks on images degrade the image. DRM walks a tight line: it inevitably makes the result worse than a DRM-free version, but it also shouldn't make the item completely unconsumable.


Video game DRM completely prevents people without a license/key from accessing the game at all.


So, do you want to define DRM by intent or by technical implementation? I'm doing the former, but it sounds like you want to do the latter. Also keep in mind that the legalese doesn't necessarily care about the exact encryption technique deployed either.


Both. Images are modified all the time prior to publishing; in fact, no image you ever see on the internet is raw sensor output. They are all altered in some manner. The images processed with this method look the same to every person and computer that views them. That's very different from DRM, which encrypts things and prevents access for unprivileged users.

This is effectively the equivalent of someone doing really crappy image processing. As other commenters have mentioned, it does alter how images look to humans as well as machines, and it can be “mitigated” through additional processing techniques.
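
To make the "additional processing" part concrete, here's a rough sketch (assuming Pillow; the function name, file names, and specific transforms are just placeholders, and I'm not claiming this actually defeats Nightshade in practice):

    # Hypothetical "cleaning" pass: re-sample and re-encode an image, the kind
    # of transform people suggest as a mitigation for pixel-level perturbations.
    from PIL import Image

    def rough_clean(src_path: str, dst_path: str, scale: float = 0.5) -> None:
        img = Image.open(src_path).convert("RGB")
        w, h = img.size
        # Downscale then upscale, which acts as a low-pass filter on fine noise.
        small = img.resize((int(w * scale), int(h * scale)), Image.LANCZOS)
        restored = small.resize((w, h), Image.LANCZOS)
        # A lossy re-encode discards more of the high-frequency detail.
        restored.save(dst_path, "JPEG", quality=85)

    rough_clean("poisoned.png", "cleaned.jpg")

The point isn't that this particular recipe works; it's that the perturbation lives in the published pixels, so anyone can run whatever processing they like over it, which is exactly why it's not access control in the DRM sense.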


>That’s very different from DRM which encrypts things and prevents access to unprivileged users.

Well, you can call it a captcha if you want. The point here is to make it harder (but not impossible) for bots to use, while inconveniencing honest actors in the process. It doesn't sound like there's a straightforward answer to "are captchas DRM" either.


How does it stop you from using an image however you want?



