Detect Lewd Content in Images with Feluda

This notebook demonstrates how to use the DetectLewdImages operator to analyze images for inappropriate content. It processes a sample image and displays a probability score indicating the likelihood of lewd content.

Install dependencies conditionally based on whether the notebook is running in Colab or locally.

%%time
import sys

IN_COLAB = "google.colab" in sys.modules
print("Running Notebook in Google Colab" if IN_COLAB else "Running Notebook locally")

if IN_COLAB:
    # Google Colab ships with preinstalled libraries such as tensorflow and numba,
    # so we install into an isolated folder (feluda_custom_venv) to avoid conflicts.
    %pip install uv
    !mkdir -p /content/feluda_custom_venv
    !uv pip install --target=/content/feluda_custom_venv --prerelease allow feluda feluda-detect-lewd-images > /dev/null 2>&1

    sys.path.insert(0, "/content/feluda_custom_venv")
else:
    !uv pip install feluda feluda-detect-lewd-images > /dev/null 2>&1
Running Notebook locally
Using Python 3.10.12 environment at: /home/aatman/Aatman/Tattle/feluda/.venv
Audited 6 packages in 11ms
CPU times: user 6.38 ms, sys: 4.13 ms, total: 10.5 ms
Wall time: 138 ms
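
As an optional sanity check (not part of the original notebook), you can confirm both packages are installed by querying their versions through the standard-library importlib.metadata; the distribution names below are the ones used in the install commands above.

from importlib.metadata import version

# Optional check: distribution names match the install commands above
print("feluda:", version("feluda"))
print("feluda-detect-lewd-images:", version("feluda-detect-lewd-images"))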

We’ll use one operator, DetectLewdImages, for this example; ImageFactory is used to download the sample image.

from feluda.factory import ImageFactory
from feluda.operators import DetectLewdImages

detector = DetectLewdImages()
IMAGE_URL = "https://github.com/tattle-made/feluda_datasets/blob/main/feluda-sample-media/people.jpg"

# Convert the GitHub "blob" page URL into a direct "raw" download URL
raw_url = IMAGE_URL.replace("/blob/", "/raw/")

# Download image using ImageFactory
image_obj = ImageFactory.make_from_url_to_path(raw_url)

# Analyze image for inappropriate content
# Returns probability score (0.0 to 1.0) indicating likelihood of lewd content
probability = detector.run(image_obj, remove_after_processing=True)
Downloading image from URL

Image downloaded

The probability computed above comes from the feluda-detect-lewd-images operator, which uses the Private Detector model from Bumble. The code block below prints the score and interprets it against example thresholds.

print(f"Lewd content probability: {probability:.4f}")

# Interpret the result
if probability < 0.3:
    print("Result: SAFE - Low probability of inappropriate content")
elif probability < 0.7:
    print("Result: MODERATE - Medium probability of inappropriate content")
else:
    print("Result: UNSAFE - High probability of inappropriate content")
Lewd content probability: 0.0543
Result: SAFE - Low probability of inappropriate content
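
A single detector instance can be reused across multiple images. The loop below is a minimal sketch that repeats the ImageFactory.make_from_url_to_path and detector.run calls from the earlier cells; the URL list is a placeholder to extend with your own images.

# Sketch: score several images with the same detector instance
# (extend the placeholder list with your own image URLs).
more_urls = [
    raw_url,  # the sample image used above
]

for url in more_urls:
    obj = ImageFactory.make_from_url_to_path(url)
    score = detector.run(obj, remove_after_processing=True)
    print(f"{url} -> lewd content probability: {score:.4f}")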
# Clean up resources when you're done

detector.cleanup()
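
Outside a notebook, the whole flow can be bundled into a small helper that guarantees cleanup even if detection fails. This is an illustrative sketch built only from the calls shown above; the URL in the usage comment is a placeholder.

# Illustrative helper: download, score, and always clean up the detector.
def lewd_probability(url: str) -> float:
    det = DetectLewdImages()
    try:
        img = ImageFactory.make_from_url_to_path(url)
        return det.run(img, remove_after_processing=True)
    finally:
        det.cleanup()

# Example usage (placeholder URL):
# print(lewd_probability("https://example.com/image.jpg"))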