
Alert on NSFW Content

One min read
Contextal Team
Contextal Platform Creators

Contextal Platform's image processor includes the capability to detect NSFW graphical content. We will create a scenario that triggers an alert action whenever such images are detected in the data flow.

The base of our scenario is the following ContexQL query, which checks the value of the $nsfw_verdict key in Image object metadata:

object_type == "Image" &&
  @match_object_meta($nsfw_verdict == "Hentai")
  || @match_object_meta($nsfw_verdict == "Sexy")
  || @match_object_meta($nsfw_verdict == "Porn")
Tip: Depending on your needs or the integration requirements, you may consider changing the action to BLOCK or QUARANTINE.
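For instance, a blocking variant of the scenario would be identical to the JSON below except for its action field:

  "action": "BLOCK"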

Info: Click the download button below to get the scenario, then upload it using Contextal Console or the ctx command line tool (when using the latter, don't forget to reload remote scenarios after adding a new one!).

NSFW-Graphics.json
{
  "name": "NSFW Graphics",
  "min_ver": 1,
  "max_ver": null,
  "creator": "Contextal",
  "description": "Alert on NSFW images.",
  "local_query": "object_type == \"Image\" && \n @match_object_meta($nsfw_verdict == \"Hentai\")\n || @match_object_meta($nsfw_verdict == \"Sexy\") \n || @match_object_meta($nsfw_verdict == \"Porn\")",
  "context": null,
  "action": "ALERT"
}
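If you prefer to generate the scenario file programmatically, for example to template the list of verdicts, the following minimal Python sketch writes an equivalent NSFW-Graphics.json. It is an illustration only, not part of the official tooling, and the resulting file is intended to match the JSON shown above:

import json

# Verdicts we want to alert on, taken from the query above.
NSFW_VERDICTS = ["Hentai", "Sexy", "Porn"]

# Build the ContexQL query: restrict to Image objects,
# then match any of the listed nsfw_verdict values.
verdict_clauses = "\n || ".join(
    f'@match_object_meta($nsfw_verdict == "{v}")' for v in NSFW_VERDICTS
)
local_query = f'object_type == "Image" && \n {verdict_clauses}'

scenario = {
    "name": "NSFW Graphics",
    "min_ver": 1,
    "max_ver": None,
    "creator": "Contextal",
    "description": "Alert on NSFW images.",
    "local_query": local_query,
    "context": None,
    "action": "ALERT",  # change to "BLOCK" or "QUARANTINE" if desired
}

with open("NSFW-Graphics.json", "w") as f:
    json.dump(scenario, f, indent=2)

The generated file can then be uploaded through Contextal Console or the ctx command line tool as described above.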