Nightshade, the tool that ‘poisons’ data, gives artists a fighting chance against AI

Intentionally poisoning someone else is never morally right. But if someone in the office keeps swiping your lunch, wouldn’t you resort to petty vengeance?
For artists, protecting work from being used to train AI models without consent is an uphill battle. Opt-out requests and do-not-scrape codes rely on AI companies engaging in good faith, but companies motivated by profit over privacy can easily disregard such measures. Sequestering themselves offline isn’t an option for most artists, who rely on social media exposure for commissions and other work opportunities.
Nightshade, a project from the University of Chicago, gives artists …