Image

Supported formats

BMP, GIF, JPEG, PNG, TIFF, WebP, SVG, ICO

Description

This backend processes image data in various formats, extracts metadata (including Exif), and performs additional analysis when necessary.

info

Available in Contextal Platform 1.0 and later.

Features

NSFW Detection

The backend uses a neural network to detect images with NSFW (Not Safe For Work) content and to assign such content a category. It relies on GantMan's model, which identifies the following categories:

  • Drawings - safe for work drawings (including anime)
  • Hentai - hentai and pornographic drawings
  • Porn - pornographic images, sexual acts
  • Sexy - sexually explicit images, not pornography
  • Neutral - safe for work neutral images

The final verdict is stored in the object's metadata under the $nsfw_verdict key; its value is Unknown when no category was assigned with sufficient confidence.
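
Conceptually, the verdict is the top-scoring category gated by a confidence threshold. The Python sketch below illustrates this; the 0.5 threshold is an assumption for illustration, not a documented value of the backend.

def nsfw_verdict(predictions: dict[str, float], threshold: float = 0.5) -> str:
    # Pick the category with the highest score...
    category, confidence = max(predictions.items(), key=lambda kv: kv[1])
    # ...and fall back to "Unknown" when the confidence is too low.
    return category if confidence >= threshold else "Unknown"

# Using the nsfw_predictions from the example metadata below yields "Neutral":
print(nsfw_verdict({"Drawings": 0.25242, "Hentai": 0.025916, "Neutral": 0.57845,
                    "Porn": 0.077091, "Sexy": 0.066123}))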

Optical Character Recognition

The backend performs optical character recognition (OCR) when requested and extracts text for further processing. The text processing backend can later detect the text's language, sentiment, profanities, embedded URLs, potential passwords, and more.
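
As a rough illustration of the OCR step, here is a minimal Python sketch using pytesseract as a stand-in for the backend's internal OCR engine (which is not documented here); the size check mirrors the TOOBIG child symbol described below.

from PIL import Image        # pip install pillow
import pytesseract           # pip install pytesseract (requires the tesseract binary)

def extract_text(path: str, max_child_output_size: int = 41943040) -> str | None:
    # Run OCR on the image and collect the recognized text.
    text = pytesseract.image_to_string(Image.open(path))
    # The backend would not store a child exceeding the size limit;
    # it would set the TOOBIG symbol on it instead.
    if len(text.encode("utf-8")) > max_child_output_size:
        return None
    return text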

Symbols

Object

  • LIMITS_REACHED → limits triggered while processing the image

Children

  • TOOBIG → text extracted via OCR was not stored because it exceeded the limits

Example Metadata

{
  "org": "ctx",
  "object_id": "d7ba6bc532a225c955411cb96c733a45ee39403fa973312bded7732e6f8e4b3c",
  "object_type": "Image",
  "object_subtype": "JPEG",
  "recursion_level": 1,
  "size": 425890,
  "hashes": {
    "sha1": "4cc5618c434ec5d02559e221eb4f10e5c748bddd",
    "md5": "23b313574a1e61545db171a23edd73b3",
    "sha256": "d7ba6bc532a225c955411cb96c733a45ee39403fa973312bded7732e6f8e4b3c",
    "sha512": "773338b1c897ab1370a15ef2b7e9b014c948ce60c3c3ec14d1cebdd77f7e65a686379363edd4a7cfefe8b0620253ea25118abd74feb35d3e8807d262cd0380fe"
  },
  "ctime": 1726504950.87833,
  "ok": {
    "symbols": [],
    "object_metadata": {
      "_backend_version": "1.0.0",
      "exif": {
        "primary": {
          "ColorSpace": "sRGB",
          "ComponentsConfiguration": "YCbCr_",
          "ExifVersion": "2.2",
          "ExposureTime": "1/55.13513513513514 s",
          "FlashpixVersion": "1.0",
          "PhotographicSensitivity": "100",
          "PixelXDimension": "2048 pixels",
          "PixelYDimension": "1536 pixels",
          "ResolutionUnit": "inch",
          "XResolution": "72 pixels per inch",
          "YCbCrPositioning": "co-sited",
          "YResolution": "72 pixels per inch"
        },
        "thumbnail": {}
      },
      "format": "jpeg",
      "height": 1536,
      "nsfw_predictions": {
        "Drawings": 0.25242,
        "Hentai": 0.025916,
        "Neutral": 0.57845,
        "Porn": 0.077091,
        "Sexy": 0.066123
      },
      "nsfw_verdict": "Neutral",
      "pixel_format": "RGB8",
      "width": 2048
    },
    "children": []
  }
}
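
Programmatic consumers can read these fields directly; a minimal Python sketch, assuming the metadata above has been saved to metadata.json:

import json

with open("metadata.json") as f:
    obj = json.load(f)

meta = obj["ok"]["object_metadata"]
print(meta["format"], f'{meta["width"]}x{meta["height"]}', meta["pixel_format"])
print("NSFW verdict:", meta["nsfw_verdict"])  # "Neutral" in this example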

Example Queries

object_type == "Email"
&& @has_descendant(object_image == "Image" &&
@match_object_meta($nsfw_verdict == "Hentai")
|| @match_object_meta($nsfw_verdict == "Sexy")
|| @match_object_meta($nsfw_verdict == "Porn")
)
  • This query matches an Email object, which at some level contains an Image object with NSFW content.
object_type == "Image"
&& @has_child(object_type == "Text"
&& @match_object_meta($natural_language_profanity_count > 0))
  • This query matches an Image object, out of which a Text object was extracted (via OCR), in which some profanities were identified.

Configuration Options

  • max_child_output_size → maximum size of a single output child object (default: 41943040)
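
For example, raising the limit to 100 MiB might look as follows. This is a hypothetical snippet: the configuration file name and layout are not documented here, and the value is assumed to be in bytes (the default, 41943040, corresponds to 40 MiB).

# Size cap for any single child object the backend emits (such as text
# extracted via OCR); children above this limit receive the TOOBIG symbol.
max_child_output_size = 104857600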