I’m talking about this sort of thing. Like clearly I wouldn’t want someone to see that on my phone in the office or when I’m sat on a bus.

However, there seem to be a lot of these that aren't filtered out by NSFW settings, when a similar picture of a woman would be, so it seems this is a deliberate choice I might not be understanding.

Discuss.

  • AnIndefiniteArticle
    13 points · 4 months ago

    I’ve seen sites that have something similar, including a “suggestive” tag for pics like OP’s.

    • @peanuts4life@lemmy.blahaj.zone
      5 points · 4 months ago

      Yeah, that would be great. Many instance admins already run CSAM classifier models on all incoming images. It'd be great if they could add additional models that automatically attach meta tags like "suggestive" and "gore", with the option for the poster to modify the tags in case of a false positive or negative (like a lasagna getting tagged as gore, for example). A rough sketch of what that could look like is below.
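
      Purely as a sketch of the idea, here's roughly what an auto-tagging hook on incoming images might look like. The model names, thresholds, and the `run_classifier` helper are all made up for illustration; they aren't anything an instance actually ships:

      ```python
      from dataclasses import dataclass, field

      @dataclass
      class UploadedImage:
          path: str
          tags: set[str] = field(default_factory=set)

      # Hypothetical classifier call: in a real setup this would invoke an
      # actual model (suggestive-content, gore, etc.) and return a
      # confidence score between 0 and 1.
      def run_classifier(model_name: str, image_path: str) -> float:
          return 0.0  # stub; replace with a real inference call

      # Illustrative tag -> (model, threshold) mapping; names are invented.
      AUTO_TAG_MODELS = {
          "suggestive": ("example/suggestive-detector", 0.8),
          "gore": ("example/gore-detector", 0.9),
      }

      def auto_tag(image: UploadedImage) -> UploadedImage:
          """Attach each meta tag whose classifier score clears its threshold."""
          for tag, (model_name, threshold) in AUTO_TAG_MODELS.items():
              if run_classifier(model_name, image.path) >= threshold:
                  image.tags.add(tag)
          return image

      def poster_edit_tags(image: UploadedImage, add: set[str], remove: set[str]) -> None:
          """Let the poster fix false negatives (add) and false positives (remove)."""
          image.tags |= add
          image.tags -= remove
      ```

      The point of `poster_edit_tags` is the override the comment describes: the models only propose tags, and the poster gets the final say when the classifier misfires.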