NSFWJS

NSFWJS is a JavaScript library that detects potentially inappropriate images directly in the client's browser, so images never need to be sent to an external server.
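A minimal sketch of that in-browser flow, based on the load and classify calls documented in the library's README (the element id here is illustrative):

```js
import * as nsfwjs from 'nsfwjs';

// The <img> element to check; the id 'photo' is an assumption.
const img = document.getElementById('photo');

// Both steps run entirely client-side: the model is downloaded once,
// and the image's pixels never leave the browser.
const model = await nsfwjs.load();
const predictions = await model.classify(img);

console.log(predictions);
// e.g. [{ className: 'Neutral', probability: 0.91 }, ...]
```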

The library is built on TensorFlow.js, the open-source machine learning library for JavaScript. Its model classifies images into content categories, with a reported accuracy of around 93%.
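The default model scores each image against five classes; one simple way to act on those scores is sketched below. The 0.7 threshold and the summing policy are assumptions for illustration, not part of the library:

```js
// The bundled model reports probabilities for five classes:
// 'Drawing', 'Hentai', 'Neutral', 'Porn', 'Sexy'.
const EXPLICIT = new Set(['Hentai', 'Porn', 'Sexy']);

// Illustrative policy: flag an image when the summed probability
// of the explicit classes crosses a chosen threshold.
function isProbablyNSFW(predictions, threshold = 0.7) {
  const score = predictions
    .filter((p) => EXPLICIT.has(p.className))
    .reduce((sum, p) => sum + p.probability, 0);
  return score >= threshold;
}
```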

The library also includes a CameraBlur Protection feature that automatically blurs images classified as potentially inappropriate. Development is ongoing, and new models are released regularly to improve performance.
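The sketch below shows the general idea behind automatic blurring, using a CSS filter on a flagged image. It is not the library's internal CameraBlur implementation, only an illustration of the technique:

```js
import * as nsfwjs from 'nsfwjs';

// Illustrative only; not the library's CameraBlur code.
const EXPLICIT = ['Hentai', 'Porn', 'Sexy'];

async function blurIfFlagged(img, model) {
  // classify() returns predictions ordered by probability,
  // so the first entry is the most likely class.
  const [top] = await model.classify(img);
  if (EXPLICIT.includes(top.className)) {
    img.style.filter = 'blur(20px)'; // obscure content, keep page layout
  }
}

// Usage: const model = await nsfwjs.load(); await blurIfFlagged(someImg, model);
```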

NSFWJS is free to use, modify, and distribute under the MIT license. A mobile demo is also available, making it easy to test images on a phone.

The source code is available on GitHub, where contributions are welcome; reporting false positives helps make the tool more reliable. Installation is shown below.
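For local use, the package can be installed from npm (depending on the version, TensorFlow.js may need to be installed alongside it):

```sh
# Install the library; @tensorflow/tfjs is its ML runtime.
npm install nsfwjs @tensorflow/tfjs
```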
