Artbreeder | AI Trace
Content Moderation (Verified)
Artbreeder uses an automated system that flags generated images as NSFW (Not Safe For Work) when they are detected as potentially explicit. The check runs automatically on every image on the platform, and users can dispute incorrect flags through a separate AI-powered checker.
Details
Artbreeder's NSFW detection system automatically classifies images at the point of generation or upload and attaches a flag to any image deemed potentially inappropriate. According to Artbreeder's official updates page, users who believe an image has been incorrectly flagged can trigger a secondary "more powerful and costlier checker" to review it. Free users receive three such dispute checks per day; paid subscribers receive unlimited checks. The platform describes the system as imperfect and acknowledges false positives. Artbreeder has not publicly specified the underlying technology of the NSFW checker beyond describing it as an automated system.
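The workflow described above, a cheap automatic first-pass flag, a user-triggered secondary check, and a daily dispute quota for free users, can be sketched as follows. This is a minimal illustration, not Artbreeder's actual implementation: the classifier functions are toy stand-ins, and all names (`fast_nsfw_check`, `strong_nsfw_check`, `dispute_flag`) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

FREE_DAILY_DISPUTES = 3  # published limit for free users; paid users are unlimited

@dataclass
class User:
    is_paid: bool = False
    disputes_used: int = 0
    dispute_day: date = field(default_factory=date.today)

def fast_nsfw_check(image: str) -> bool:
    """Toy stand-in for the cheap classifier run on every image (assumption)."""
    return "explicit" in image

def strong_nsfw_check(image: str) -> bool:
    """Toy stand-in for the 'more powerful and costlier checker' (assumption)."""
    return "explicit" in image and "artistic" not in image

def generate_image(prompt: str) -> dict:
    """Attach an NSFW flag at the point of generation."""
    return {"image": prompt, "nsfw_flag": fast_nsfw_check(prompt)}

def dispute_flag(user: User, image: str) -> str:
    """Re-check a flagged image with the stronger model, enforcing the quota."""
    today = date.today()
    if user.dispute_day != today:  # reset the daily quota at the day boundary
        user.dispute_day, user.disputes_used = today, 0
    if not user.is_paid and user.disputes_used >= FREE_DAILY_DISPUTES:
        return "quota_exhausted"   # free users: three dispute checks per day
    user.disputes_used += 1
    return "flag_removed" if not strong_nsfw_check(image) else "flag_upheld"
```

The two-tier design mirrors the trade-off the source describes: the fast check is affordable enough to run on everything, while the costlier check is reserved for disputes and rate-limited for non-paying users.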