How to Classify NSFW (Not Safe for Work) Imagery with AI Content Moderation using Java

This article highlights some of the contemporary challenges in moderating degrees of explicit NSFW (Not Safe for Work) image content on websites, and demonstrates a cloud-based Artificial Intelligence Content Moderation API that can be deployed to increase the efficacy of the content moderation process.

Pornographic images are typically banned on mainstream websites and professional networks. That's because failing to ban such content exposes website patrons and employees alike to unsolicited, sexually explicit imagery, which can amount to charges of sexual harassment, depending on how litigious your region of the world is. Enforcing a ban on such content is no small task, however. This is due in part to the large volume of image files uploaded to content-curating networks each day, and in part to the difficulty of clearly defining policies against imagery that is sexually suggestive to some degree (i.e., racy) rather than fully pornographic.
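Most cloud moderation services of this kind accept an uploaded image and return a numeric explicitness score, which the caller then maps onto their own policy (e.g., block, flag for review, allow). As a minimal sketch of that pattern in Java using only the standard library's `HttpClient`: the endpoint URL, the `Apikey` header, the response shape, and the score thresholds below are all illustrative assumptions, not any particular provider's API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

public class NsfwClassifierSketch {

    // Hypothetical endpoint; substitute your moderation provider's actual URL.
    private static final String ENDPOINT = "https://api.example.com/image/nsfw/classify";

    /**
     * Map a 0.0-1.0 explicitness score to a policy label.
     * The thresholds here are illustrative, not provider-defined;
     * tune them to your own tolerance for racy vs. pornographic content.
     */
    public static String label(double score) {
        if (score >= 0.8) return "explicit"; // likely pornographic: block
        if (score >= 0.4) return "racy";     // sexually suggestive: flag for review
        return "safe";                       // allow
    }

    /** POST raw image bytes to the (hypothetical) service and return its response body. */
    public static String classify(Path imageFile, String apiKey) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Apikey", apiKey) // hypothetical auth header name
                .header("Content-Type", "application/octet-stream")
                .POST(HttpRequest.BodyPublishers.ofByteArray(Files.readAllBytes(imageFile)))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // assumed to be JSON containing a confidence score
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.out.println("usage: NsfwClassifierSketch <image-path> <api-key>");
            return;
        }
        System.out.println(classify(Path.of(args[0]), args[1]));
    }
}
```

Keeping the score-to-label mapping in your own code, rather than relying on the service's built-in categories, makes it easy to adjust the racy/explicit boundary as your content policy evolves.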
