Title, or at least the inverse should be encouraged. This has been talked about before, but with how bad things are getting, and how realistic AI-generated videos have become, anything feels better than nothing. AI watermarks or metadata can be removed, but that’s not the point; the point is deterrence. All the big tech companies would comply immediately (at least on the surface, for consumer-facing products), and we would probably see a massive decrease in malicious use. People will bypass it, remove watermarks, and fix metadata, but the situation should still be quite a bit better. I don’t see many downsides.
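
Just to illustrate how low the bar for removal is, here is a rough sketch (Pillow, with placeholder file names) of stripping metadata by re-saving only the pixels; a watermark baked into the pixels themselves would survive this, but provenance tags would not:

    # Rough sketch: strip provenance metadata from an image by copying only
    # the pixel data into a fresh file (EXIF, XMP and PNG text chunks are lost).
    # File names are placeholders.
    from PIL import Image

    def strip_metadata(src_path: str, dst_path: str) -> None:
        with Image.open(src_path) as im:
            clean = Image.new(im.mode, im.size)   # fresh image, no attached metadata
            clean.putdata(list(im.getdata()))     # copy pixels only
            clean.save(dst_path)

    strip_metadata("ai_generated.png", "laundered.png")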

  • jjmoldy@lemmy.world · 4 days ago

    How would such a law be enforced? What agency would enforce it? What penalty would one face for breaking this law?

    • queermunist she/her@lemmy.ml · 4 days ago (edited)

      Force the AI models to embed some kind of metadata in everything they generate. Training AI models is a massive undertaking; it’s not like they can hide what they’re doing. We know who is training these models and where their data centers are, so a regulatory agency would certainly be able to force them to comply.
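
      Something like this, roughly; a minimal sketch of stamping provenance tags into every generated file (the key names here are made up for illustration, not an existing standard):

          # Rough sketch: embed provenance metadata in a generated PNG.
          # The keys/values are illustrative placeholders, not a real standard.
          from PIL import Image, PngImagePlugin

          def save_with_provenance(image: Image.Image, path: str, model_id: str) -> None:
              meta = PngImagePlugin.PngInfo()
              meta.add_text("ai-generated", "true")
              meta.add_text("generator-model", model_id)
              image.save(path, pnginfo=meta)

          img = Image.new("RGB", (512, 512))  # stand-in for a model's actual output
          save_with_provenance(img, "output.png", "example-diffusion-v1")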

      In the US this could be done through the FCC; in other countries the power can be vested in the regulatory bodies that oversee communications, broadcasting, etc.

      The penalty? Break them on the fucking wheel.

        • queermunist she/her@lemmy.ml · 4 days ago

          If they can find cannabis grow ops from power usage, they certainly can find people using massive amounts of data and processing power and public water and investor cash to train AI. You expect me to believe this could be done in secret?

          • FourWaveforms@lemm.ee · 2 days ago

            Yes.

            Running AI models and playing 3D games use the same hardware and electricity. You can run an LLM or a diffusion model on a home computer, and many people do. Is someone going to show up with a clipboard and demand to see your Steam account?
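
            For example, local inference is about this much effort with the llama-cpp-python library (the model path is a placeholder for whatever GGUF file someone has downloaded):

                # Rough sketch: run a local LLM on ordinary consumer hardware.
                # The model path is a placeholder.
                from llama_cpp import Llama

                llm = Llama(model_path="./models/some-7b-model.Q4_K_M.gguf", n_ctx=2048)
                out = llm("Write one sentence about GPUs.", max_tokens=64)
                print(out["choices"][0]["text"])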

            • queermunist she/her@lemmy.ml · 1 day ago

              Training AI models is completely different, though. That requires massive amounts of compute and data and electricity and water, and that’s all very easy for the government to track.

              • FourWaveforms@lemm.ee · 17 hours ago

                If someone trains an open-source AI model to fingerprint its output, someone else can use abliteration or other methods to defeat that, and it won’t require re-training. An example is DeepSeek-R1’s “1776” variant, which someone uncensored so that it now talks freely about Tiananmen Square.

                Even without that, it’s not practical for a government to find all instances of model training. Thousands of people can rent the same GPUs in the same data centers. A small organization training one model can have the same power consumption as a large organization running inference. It would take advanced surveillance to get around that.

                It’s also becoming possible to train larger and larger models without needing a data center at all. Nvidia is coming out with a 128GB desktop machine that delivers 1 petaflop at FP4 for 170 watts; FP8 would be on the order of hundreds of teraflops. Ten of them could talk over an InfiniBand switch. You could run that setup in an apartment, or in a LAN closet.
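
                Back-of-envelope, assuming FP8 throughput is roughly half the FP4 figure (my assumption, not a published spec):

                    # Rough arithmetic for a hypothetical ten-unit desktop cluster.
                    fp4_pflops = 1.0
                    fp8_pflops = fp4_pflops / 2          # assumption: half of FP4
                    boxes, watts, mem_gb = 10, 170, 128

                    print(f"aggregate FP8: {boxes * fp8_pflops:.1f} PFLOPS")  # ~5 PFLOPS
                    print(f"aggregate memory: {boxes * mem_gb} GB")           # 1280 GB
                    print(f"total power draw: {boxes * watts} W")             # 1700 W, apartment-scale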

                • queermunist she/her@lemmy.ml · 5 hours ago (edited)

                  It’s practical for a government to regulate Microsoft, Google, Amazon, OpenAI, etc. Who cares if they can’t catch everything? Focusing on the biggest problems is perfectly fine imo; the worst offenders are the biggest companies, as usual.

                  Your company’s AI model got retrained and used in a way that violates regulations? Whelp, looks like your company is liable for that. Oh, that wasn’t done by your company or anyone involved? Too fucking bad, should have made it harder to retrain your model.

                  And if they resist, break them on the fucking wheel.

                  You act like it’s impossible and so we shouldn’t even try, which is honestly just an anti-regulation talking point that gets trotted out for literally everything.

      • jjmoldy@lemmy.world · 4 days ago

        Medieval torture in response to what is essentially copyright infringement. Very sane!

          • jjmoldy@lemmy.world · 4 days ago

            Well that’s certainly less extreme than breaking on the wheel, I’ll give you that, but it doesn’t seem very realistic in most countries, where nationalization is rare and done mainly for strategic purposes.

            • queermunist she/her@lemmy.ml · 4 days ago (edited)

              Well, the most realistic outcome is that there will be no regulations at all, or that any regulations will just be toothless fines or something.

              I didn’t realize we were limiting ourselves to our backwards political system where the rich and powerful write their own regulations.

              Nothing will be done, realistically.

              Nothing is ever done about anything.

              • jjmoldy@lemmy.world · 4 days ago

                I gather from your username that you consider yourself a communist? How do you suppose your ambitions could be put into reality when the movement is so devastatingly weak and disorganized?

                • queermunist she/her@lemmy.ml · 4 days ago

                  Things only look that way when you’re a Western Marxist and reject actually existing socialism around the world. China is hardly weak or disorganized.

                  Or do you mean AI regulation? I think it’s probably best to just focus on AI being used for war and struggle against that (No Tech For Apartheid comes to mind), rather than try and tackle all AI everywhere all at once.

                  • jjmoldy@lemmy.world · 4 days ago (edited)

                    To be clear, I am not a socialist or communist, Western or otherwise. Yes, China is ascendant on the world stage and likely will continue to be, but they have shown no willingness to aid communist movements abroad. That could change, of course, but it seems to me that the CPC is more concerned with maintaining international relationships and economic agreements than with fomenting global revolution. I also somewhat doubt the party’s commitment to eventually ‘withering away,’ as Marx put it. To be fair, I know that couldn’t happen unless the whole world was on board, or they’d get promptly steamrolled by one adversary or another.