AI 'Poison' Tool Against Web Crawlers Cracks Under Demand, Users Must Wait for Weeks

A popular "AI poison" tool to protect user content from data web crawlers has finally cracked amid the surge in demand for the tool and increasingly sophisticated AI tools able to bypass its guardrails.

In a post on X (formerly Twitter), the Glaze Project shared that it is pausing its services for several weeks after requests surged to more than 200 per day each from X and Instagram.


The Glaze Project, whose WebGlaze service is currently invite-only, is taking the break to "rethink the way we run webglaze right now" and build a more sustainable platform for a larger user base.

The announcement came shortly after Facebook and Instagram updated their terms of service, allowing the platforms to collect users' posts to train parent company Meta's AI models.

Project leader and University of Chicago professor Ben Zhao also indicated plans to improve the tool's guardrails after an earlier study showed that such protection tools remain vulnerable to "AI mimicry."

Also Read: University of Chicago Researchers Have Found a Way to 'Poison' Training Data for AI

Art Theft Surges Amid Increasing AI Demand

The Glaze Project has been among the most widely used "AI poison" tools since University of Chicago researchers first launched it late last year in response to growing concerns about AI stealing content from users and artists.

Earlier this year, popular AI image generator Midjourney was caught using artworks from thousands of artists after the release of a spreadsheet its developers used to train their AI to mimic the artists' styles.

Several artists have since filed copyright infringement lawsuits against the AI firms but have seen only minimal success due to the limited laws governing the new technology.

Related Article: How to Protect Your Face, Voice from Being Used in AI Deepfakes

Alternative 'AI Poison' Tools

Although the Glaze Project is currently paused, concerned artists and users can still employ techniques that protect their works, to some degree, in the same way the "AI poison" tool does.

This can be done by adding a partially opaque, distorted layer, preferably one with strong color variations, to prevent the AI from fully mimicking the image or video.

The technique works by distorting the image's details, preventing AI models from properly recognizing the media and thus impeding their training.

Best of all, this method can be applied in almost any graphic editing or digital drawing app.
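For readers comfortable with scripting, the same overlay idea can be approximated in a few lines of Python. To be clear, this is only a minimal sketch of the generic noise-layer technique described above, not the Glaze algorithm itself (Glaze computes carefully targeted adversarial perturbations and offers far stronger protection); the file names and opacity value here are illustrative assumptions.

```python
# Minimal sketch of the overlay technique: blend a low-opacity layer of
# high-variation color noise over an image.
# NOTE: this is NOT the Glaze algorithm; it only approximates the generic
# "distorted layer" approach and provides weaker protection.
import numpy as np
from PIL import Image

def add_noise_layer(src_path: str, dst_path: str, opacity: float = 0.12) -> None:
    """Blend random RGB noise over the source image and save the result."""
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.float32)

    # Random noise spanning the full color range gives the strong color
    # variations the technique calls for.
    noise = np.random.uniform(0, 255, size=img.shape).astype(np.float32)

    # Alpha-blend: a low opacity keeps the artwork recognizable to humans
    # while perturbing the fine detail that models key on.
    blended = (1.0 - opacity) * img + opacity * noise
    Image.fromarray(blended.clip(0, 255).astype(np.uint8)).save(dst_path)

if __name__ == "__main__":
    # Hypothetical file names for illustration.
    add_noise_layer("artwork.png", "artwork_protected.png")
```

Raising the `opacity` parameter strengthens the distortion at the cost of visible degradation, the same trade-off artists face when applying this kind of layer in an image editor.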
