Quick summary
NSFW JS is a client-side JavaScript library from Infinite Red that detects potentially explicit or inappropriate images directly in the user's browser. Images are never sent to a remote server, and content flagged as problematic can be blurred automatically.
How it works
The library runs a pre-trained neural network in the browser via TensorFlow.js, so all image analysis happens on the client. Detection is local, avoiding round-trips to a backend, which helps with privacy and reduces server load.
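As a concrete illustration, here is a minimal sketch of the typical load-and-classify flow using the nsfwjs npm package; the element id "photo" is an assumption for this example:

```ts
import * as nsfwjs from "nsfwjs";

// Load the default model: downloaded once, then all inference
// runs locally in the browser.
const model = await nsfwjs.load();

// Classify an <img> element already in the DOM ("photo" is an assumed id).
const img = document.getElementById("photo") as HTMLImageElement;
const predictions = await model.classify(img);

// predictions is an array of { className, probability } pairs, e.g.
// [{ className: "Neutral", probability: 0.91 }, ...]
console.log(predictions);
```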
Notable capabilities
- Runs entirely in-browser for on-device image evaluation and filtering
- Automatically blurs images that are flagged, via the built-in CameraBlur feature (a manual equivalent is sketched after this list)
- Classifies image content into categories with a reported accuracy of about 93%
- Built on TensorFlow.js to leverage browser-compatible neural models
- Compact runtime footprint so it is practical for many front-end contexts
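The built-in blur feature has its own API that isn't reproduced here; the sketch below implements the same blur-on-flag idea by hand, assuming the model's standard class names and a hypothetical 0.7 probability threshold:

```ts
import * as nsfwjs from "nsfwjs";

// Assumed policy: blur when any flagged class scores above 0.7.
const BLUR_THRESHOLD = 0.7;
const FLAGGED = new Set(["Porn", "Hentai", "Sexy"]);

async function blurIfFlagged(img: HTMLImageElement, model: nsfwjs.NSFWJS) {
  const predictions = await model.classify(img);
  const flagged = predictions.some(
    (p) => FLAGGED.has(p.className) && p.probability > BLUR_THRESHOLD
  );
  if (flagged) {
    // A CSS filter keeps the image in the layout while hiding its content.
    img.style.filter = "blur(24px)";
  }
}

const model = await nsfwjs.load();
for (const img of Array.from(
  document.querySelectorAll<HTMLImageElement>("img")
)) {
  await blurIfFlagged(img, model);
}
```

The threshold and the set of flagged classes are policy choices, not library defaults; adjust them to match your application's tolerance for false positives.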
Package, licensing, and updates
NSFW JS is distributed under the MIT license. The project is actively maintained with periodic model improvements and library updates to improve detection and expand supported scenarios.
Size, demos, and deployment
The library is lightweight (around 4.2 MB) and includes a mobile-friendly demo to test behavior on phones and tablets. Because it runs client-side, deployment can be as simple as adding the script to your web app.
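For a bundler-based setup, installing `nsfwjs` alongside its `@tensorflow/tfjs` peer dependency and importing it is typically all that's required. The sketch below additionally points `load()` at self-hosted model files so no third-party hosting is involved; the `/static/nsfwjs-model/` path is an assumption:

```ts
import * as nsfwjs from "nsfwjs";

// Serving the model files alongside your app keeps everything first-party.
// "/static/nsfwjs-model/" is an assumed path to a directory containing the
// model's model.json and weight shards.
const model = await nsfwjs.load("/static/nsfwjs-model/");

// After loading, classification needs no further network access.
const img = document.querySelector<HTMLImageElement>("img");
if (img) {
  console.log(await model.classify(img));
}
```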
Community and alternatives
The project welcomes community contributions and pull requests to extend functionality or add new models. If you need a hosted or subscription-based service instead of a client-side library, evaluate commercial moderation APIs or managed offerings that provide server-side filtering and support.