This is an NSFW content detector based on Falconsai/nsfw_image_detection.
Model: google/vit-base-patch16-224-in21k
You can try it online (using the public API): NSFW Detector
Compared to other common NSFW detectors, this detector has the following advantages:
- AI-based, providing better accuracy.
- Supports CPU-only inference and can run on most servers.
- Automatically utilizes multiple CPUs to accelerate inference.
- Simple classification with only two categories: nsfw and normal.
- Provides service via API, making it easier to integrate with other applications.
- Docker-based deployment, suitable for distributed deployment.
- Runs entirely locally, keeping your data secure.
Running this model requires up to 2GB of memory; no GPU is required.
When handling a large number of requests simultaneously, more memory may be required.
This detector supports checking the following file types:
- ✅ Images (supported)
- ✅ PDF files (supported)
- ✅ Videos (supported)
- ✅ Files in compressed packages (supported)
```bash
docker run -d -p 3333:3333 --name nsfw-detector vxlink/nsfw_detector:latest
```
Supported architectures: x86_64, ARM64.
```bash
# Detection
curl -X POST -F "file=@/path/to/image.jpg" http://localhost:3333/check
```
You can also open http://localhost:3333 in a browser.
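To integrate the detector with another application, a minimal Python sketch using the `requests` library is shown below. It mirrors the curl command above; the exact JSON fields in the response depend on the API version, so they are not assumed here.

```python
import requests

def check_file(path: str, endpoint: str = "http://localhost:3333/check") -> dict:
    """Post a local file to the detector's /check endpoint and return the parsed JSON."""
    with open(path, "rb") as f:
        # The multipart field name "file" matches the -F "file=@..." form used by curl.
        resp = requests.post(endpoint, files={"file": f}, timeout=60)
    resp.raise_for_status()
    return resp.json()  # assumed to contain the nsfw/normal classification

if __name__ == "__main__":
    print(check_file("/path/to/image.jpg"))
```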
You can use the public API service provided by vx.link.
```bash
# Detect a file; the file type is recognized automatically
curl -X POST -F "file=@/path/to/image.jpg" https://vx.link/public/nsfw
```
- Your submitted images will not be saved.
- Please note that the API rate limit is 30 requests per minute.
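When calling the public endpoint from code, spacing requests at least two seconds apart keeps you within the 30-requests-per-minute limit. The sketch below is an illustrative throttle, not an official client; the response format is assumed to be JSON.

```python
import time
import requests

PUBLIC_ENDPOINT = "https://vx.link/public/nsfw"

def check_many(paths):
    """Check several files against the public API, pausing so the rate stays <= 30/min."""
    results = {}
    for i, path in enumerate(paths):
        if i > 0:
            time.sleep(2)  # 60 s / 30 requests = one request every 2 seconds
        with open(path, "rb") as f:
            resp = requests.post(PUBLIC_ENDPOINT, files={"file": f}, timeout=60)
        resp.raise_for_status()
        results[path] = resp.json()
    return results
```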
This project is open-source under the Apache 2.0 license.