Not a good look for Mastodon - what can be done to automate the removal of CSAM?

      • DaddyLoïc© ⏚@g33ks.coffee · 1 year ago (+5/−5)

        @Arotrios @corb3t @redcalcium perhaps we should learn not to stand behind Cloudflare at all! Their proxy:
        - filters out real people,
        - randomly blocks some requests between ActivityPub servers ❌

        the best way to deal with unsolicited content is human moderation: a little instance, a few people, human scale… a #smallWeb made of lots of little instances, with no need for a big centralized proxy… 🧠

        some related debates: 💡 https://toot.cafe/@Coffee/109480850755446647
        https://g33ks.coffee/@coffee/110519150084601332

        • P03 Locke@lemmy.dbzer0.com · 1 year ago (+3)

          I trust Cloudflare a helluva lot more than I trust most of the companies discussed in this thread. Their transparency is second to none.

        • Arotrios@kbin.social · 1 year ago (+3)

          Thanks for the comment - I wasn’t aware of a Cloudflare controversy in play, and went through your links and the associated Wikipedia page. It’s interesting to me, as someone who previously ran a public forum, to see them struggle on a larger scale with the same issues surrounding hate speech that I did.

          I agree with your thoughts on a centralized service having that much power, although Cloudflare does have a number of competitors, so I’m not quite seeing the risk here, save for the fact that Cloudflare appears to be the only one offering CSAM filtering (I’ll have to dig in further to confirm). The ActivityPub blocking for particular instances is concerning, but I don’t see a source on that - do you have more detail?

          However, I disagree with your statement on handling unsolicited content - from personal experience, I can confidently state that some things get submitted that you just shouldn’t subject another human to, even if it’s only to determine whether or not they should be deleted. CSAM falls under this category in my book. Having a system in place that keeps you and your moderators from ever having to see it is invaluable to a small instance owner.
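
          To make that concrete, here’s a minimal sketch of the kind of automated pre-screening I mean: hash each incoming upload, compare it against a vendor-supplied list of known-bad hashes, and quarantine matches so no human ever has to view them. The file names, directories, and hash-list format below are hypothetical, and real tools (Microsoft’s PhotoDNA, Cloudflare’s CSAM Scanning Tool) use perceptual hashes that survive re-encoding and resizing; the exact SHA-256 match shown here only catches byte-identical files.

          # Minimal sketch of hash-list pre-screening (hypothetical names throughout).
          # A real deployment would call a perceptual-hash service; plain SHA-256
          # only flags byte-identical copies of known material.
          import hashlib
          import shutil
          from pathlib import Path

          HASH_LIST = Path("known_hashes.txt")   # hypothetical vendor hash list, one hex digest per line
          INCOMING = Path("incoming")            # hypothetical upload spool directory
          QUARANTINE = Path("quarantine")        # matches are moved here, unseen by moderators

          def load_known_hashes(path: Path) -> set[str]:
              """Read the vendor hash list into a set of lowercase hex digests."""
              return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}

          def screen_upload(upload: Path, known: set[str]) -> bool:
              """Quarantine the file and return True if its hash is on the known-bad list."""
              digest = hashlib.sha256(upload.read_bytes()).hexdigest()
              if digest in known:
                  QUARANTINE.mkdir(exist_ok=True)
                  shutil.move(str(upload), str(QUARANTINE / upload.name))
                  return True
              return False

          if __name__ == "__main__":
              known = load_known_hashes(HASH_LIST)
              for f in INCOMING.iterdir():
                  if screen_upload(f, known):
                      print(f"quarantined {f.name}: matched a known-bad hash")

          In practice the matching step would be a call out to a third-party API rather than a local list, which is exactly the trade-off being debated here: you spare humans from viewing the material, at the cost of routing your users’ media through a centralized service.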

          • DaddyLoïc© ⏚@g33ks.coffee · 1 year ago (+3/−2)

            @Arotrios @corb3t @redcalcium it’s all about community & trust. I agree that no one should ever have to deal with such offensive content, and my answer again is: running a small instance, with few people, creating a safe space, building trust… ☮️

            Of course, it’s a different approach to creating and maintaining an online community, I guess. We don’t have to deal with unsolicited content here because we are 20 people and we kind of know each other; signup is invitation-only, so you are kind of responsible for whoever you bring here… and we care about people, each of them! Again, community & trust over any tool 👌

            Obviously we do not share the same vision here, but that’s ok. I’m not trying to convince you; I just wanted to say that our approach is completely different 😉

            more about filtering content: https://www.devever.net/~hl/cloudflare 💡