Less than a week after the AI poisoning tool known as Nightshade was made available online for free, more than 250,000 people have downloaded and started using it. Artists and creators can use the tool to tag their images at the pixel level in a way that is undetectable to the human eye but wreaks havoc on the AI models trained on them. It is part of an effort known as the Glaze Project, designed to “increase the cost of training on unlicensed data and making licensing images from creators a more attractive option.”
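For intuition only, the sketch below shows what a pixel-level perturbation looks like in code: it adds random noise bounded by a tiny per-channel budget so the change stays invisible to a viewer. This is not Nightshade's actual method, which optimizes its perturbations to mislead model training rather than using random noise; the function name, file names, and the epsilon parameter here are hypothetical.

```python
# Illustrative sketch only: bound random noise to +/- epsilon per channel so the
# edit is imperceptible. Nightshade itself computes targeted perturbations; this
# snippet just demonstrates the idea of a tiny pixel-level change.
import numpy as np
from PIL import Image

def perturb_image(path: str, out_path: str, epsilon: int = 2) -> None:
    """Add bounded random noise (at most +/- epsilon per channel) to an image."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

# Example usage (hypothetical file names):
# perturb_image("artwork.png", "artwork_shaded.png")
```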
Published on February 06, 2024 06:00