Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school student, a victim of nonconsensual sexually explicit deepfakes, to discuss a bill stalled in the House.

  • TORFdot0@lemmy.world · 11 months ago

    The text of the bill exempts service providers from liability as long as they make a good-faith attempt to remove the content as soon as they are aware of its existence. So if someone posts AI-generated revenge porn on your instance, as long as you take it down when notified, you won't be in trouble.

      • TORFdot0@lemmy.world · 11 months ago

        Section 2252D (a) Offense.—Whoever, in or affecting interstate or foreign commerce, discloses or threatens to disclose an intimate digital depiction—

        “(1) with the intent to harass, annoy, threaten, alarm, or cause substantial harm to the finances or reputation of the depicted individual; or

        “(2) with actual knowledge that, or reckless disregard for whether, such disclosure or threatened disclosure will cause physical, emotional, reputational, or economic harm to the depicted individual,

        (d) Limitations.—For purposes of this section, a provider of an interactive computer service shall not be held liable on account of—

        “(1) any action voluntarily taken in good faith to restrict access to or availability of intimate digital depictions; or

        “(2) any action taken to enable or make available to information content providers or other persons the technical means to restrict access to intimate digital depictions.

        So the law requires intent and carves out exceptions for service providers that try to remove it.

        You can read the whole text here

        • General_Effort@lemmy.world · 11 months ago

          The lower part just says that overeager removal of depictions does not create liability. Say OnlyFans bans a creator's account because some face-recognition AI thought their porn depicted a celebrity. The creator has no recourse for lost income.

          As to the upper part, I am not sure what "reckless disregard" means in this context. I don't think it means that you only have to act if you happen to receive a complaint. If you see nudes of some non-porn celebrity, then it's most likely a fake. It seems reckless not to remove it immediately. What if there are not enough mods to look at each image? Is it reckless to keep operating?

          • TORFdot0@lemmy.world · 11 months ago (edited)

            (d) Limitations.—For purposes of this section, a provider of an interactive computer service shall not be held liable on account of—

            “(1) any action voluntarily taken in good faith to restrict access to or availability of intimate digital depictions; or

            “(2) any action taken to enable or make available to information content providers or other persons the technical means to restrict access to intimate digital depictions.

            I appreciate your reading of the text. I am not a lawyer, so it isn't always clear how to read the legal language crafted into these bills. Since the quoted part is under the criminal-penalty section of the bill, I read it as releasing the service provider from criminal liability if they try to stop the distribution. I see your point in how you read it, and that makes sense to me.

            • General_Effort@lemmy.world · 11 months ago

              Yes, expressions can have meanings that are unclear to non-experts, like "reckless disregard". It means specific things in the context of specific laws, and I can't guess how it should be interpreted here.


              shall not be held liable on account of any action taken

              1. to restrict access.

              2. to make available the technical means to restrict access.

              I took some words out to improve readability.

              I believe the second one is for, e.g., someone making a database of banned material so that it can be filtered automatically on upload. Or if someone uses those images to train an AI to recognize fakes. For that purpose it will be necessary to "disclose" (i.e., distribute) the images to the people working on it; perhaps an outside company.
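
              The filtering idea in the comment above can be sketched roughly as follows. This is a minimal illustration, not how any real platform works: it assumes exact-match SHA-256 hashing (production systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding), and the `BANNED_HASHES` set and `is_banned` function are hypothetical names.

```python
import hashlib

# Hypothetical database of hashes of known banned images.
# Real systems use perceptual hashing, not exact SHA-256 matching,
# so that minor edits to an image still match.
BANNED_HASHES = {
    # SHA-256 of an empty file, used here purely as a demo entry.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_banned(upload_bytes: bytes) -> bool:
    """Return True if the uploaded file matches a known banned hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in BANNED_HASHES

# Usage: check the upload before it is ever published.
print(is_banned(b""))            # demo entry matches
print(is_banned(b"other bytes")) # unknown content passes
```

              The point of subsection (d)(2) in this reading is that building and distributing such a database (the "technical means to restrict access") would not itself count as unlawful disclosure.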