NSFW Detector

The Not Safe For Work (NSFW) detector scans videos and images and removes explicit content from the forum. Its purpose is to protect minors on the internet to the best of our ability.

Note: Detection is not 100% accurate, but it performs well. The detector is built entirely on machine learning: a convolutional neural network (CNN) for images and NLP for text.

Please do not access the dataset or the network tab in your browser's developer tools unless you are a legal adult and want to contribute more data for more accurate scans. The dataset contains sensitive imagery that may be disturbing to some users; it exists for AI training purposes ONLY. So please don't sue me.

Update log

  • 1.0.0 - Initial full release.
  • 1.0.1 - Fixed the configuration settings to accept appropriate input.

How it works

The script loads on the website and automatically blurs every image, adding a buffer so an image cannot be un-blurred while it is being scanned. If an image scores above the threshold, it is removed from the frontend, but NOT from the backend. The script also blurs and removes bad words that appear on the word list.
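The blur-scan-remove flow described above can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual code: the function names, the `nsfwScore` field, and the 0.85 threshold are all assumptions, since the real API and cutoff are not documented here.

```javascript
// Assumed NSFW-probability cutoff; the real plugin's threshold may differ.
const THRESHOLD = 0.85;

// Stand-in for the CNN classifier: returns a probability in [0, 1].
// The real plugin would run the model on the image pixels here.
function scanImage(image) {
  return image.nsfwScore;
}

// Every image starts out blurred; after scanning, it is either removed
// from the frontend (backend copy untouched) or un-blurred as safe.
function moderate(image, threshold = THRESHOLD) {
  const score = scanImage(image);
  if (score >= threshold) {
    return "removed";   // dropped from the frontend only
  }
  return "unblurred";   // safe: the blur is lifted
}

console.log(moderate({ nsfwScore: 0.95 })); // removed
console.log(moderate({ nsfwScore: 0.10 })); // unblurred
```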

What images/videos does it remove

  • Pornographic imagery
  • Child exploitation/underage content
  • Inappropriate weapons (guns, bombs, etc.)
  • Sexual acts and "toys"
  • Sexual assault (SA)/domestic violence (DV)
  • Nudity
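The categories above might map onto classifier labels roughly as in the sketch below. The label names and the `shouldRemove` helper are purely hypothetical; the plugin's real label set is not documented here.

```javascript
// Hypothetical mapping from classifier labels to the blocked categories
// listed above; the real plugin's label names may differ.
const BLOCKED_CATEGORIES = new Set([
  "pornography",
  "child_exploitation",
  "weapons",
  "sexual_acts",
  "sexual_assault_dv",
  "nudity",
]);

// An image is removed when its top predicted label is a blocked category.
function shouldRemove(topLabel) {
  return BLOCKED_CATEGORIES.has(topLabel);
}

console.log(shouldRemove("weapons"));   // true
console.log(shouldRemove("landscape")); // false
```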

Feature updates

  1. More datasets, but I also want to keep this lightweight.
  2. Other things like bug fixes and/or user requests.
  3. Detect uploaded videos (right now I believe Flatboard only supports image uploads and embedded third-party videos).

Internet safety laws

📝 License

Copyright © 2025 Flatboard Team
This project is GPL-3.0 licensed.


Information

  • Compatibility: Flatboard >=5.0.0
  • Created: 22 Dec 2025
  • Updated: 29 Jan 2026