• Even_Adder@lemmy.dbzer0.com · 11 months ago

    They aren’t insufficient; they’re working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn’t be trying to make that any worse.

    • Dkarma@lemmy.world · 11 months ago

      Yep. Copyright should not include “viewing or analyzing the picture” rights. Artists want to start charging people, or software, just to look at art they literally put out for free. If you don’t want your art seen by a person or an AI, then don’t publish it.

      • EldritchFeminity@lemmy.blahaj.zone · 11 months ago

        Copyright should absolutely include analyzing when you’re talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non-artists alike, but these kinds of people are ruining it for everybody.

        What artists are asking for is ethical sourcing for AI datasets. We’re talking about paying a licensing fee or using free art that’s opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. The music industry has already made it illegal to use songs in AI without the artist’s permission. You can’t just take songs, make your own synthesizer out of them, and then sell it. If you want music for something you’re making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That’s what artists want.

        When an artist who does art for a living posts something online, it’s an ad for their skills. People want to use AI to take the artist out of the equation, and doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn’t want to buy his art. The Sistine Chapel was commissioned by a Pope. Take the artist out of the equation and what’s left? Just AI art made as a derivative of AI art that was made as a derivative of other art.

          • EldritchFeminity@lemmy.blahaj.zone · 11 months ago

            MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists’ work. The derivative works infringement is already happening right out in the open.

            • Even_Adder@lemmy.dbzer0.com · 11 months ago

              Something being derivative doesn’t mean it’s automatically illegal or improper.

              First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much creative expression from the original.

              Second, even if a person makes a copy or a derivative work, the use is not infringing if it is a “fair use.” Whether a use is fair depends on a number of factors, including the purpose of the use, the nature of the original work, how much is used, and potential harm to the market for the original work.

              Even if a court concludes that a model is a derivative work under copyright law, creating the model is likely a lawful fair use. Fair use protects reverse engineering, indexing for search engines, and other forms of analysis that create new knowledge about works or bodies of works. Here, the fact that the model is used to create new works weighs in favor of fair use as does the fact that the model consists of original analysis of the training images in comparison with one another.

              You are expressly allowed to mimic others’ works as long as you don’t substantially reproduce their work. That’s a big part of why art can exist in the first place. You should check out that article I linked.

              • EldritchFeminity@lemmy.blahaj.zone · 11 months ago

                I actually did read it; that’s why I specifically called out MidJourney here, as they’re the one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generation parameters and you get a piece trained on their art. It’s essentially using an LLM to run an art-tracing scheme while skirting copyright law.

                I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially, Creative Commons licensing for works used in datasets, to ensure that artists are duly compensated when their works are used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring they have the ability to protect their job as an artist from stuff like what MidJourney is doing.
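
                Just to sketch what I mean by opt-in sourcing (this is a toy example, every name in it is made up, and it’s not any real pipeline): works would only make it into a training set if the artist explicitly opted in or released them under a free license.

                ```python
                # Toy sketch of opt-in dataset sourcing (all names hypothetical).
                from dataclasses import dataclass

                @dataclass
                class Artwork:
                    url: str
                    artist: str
                    license: str           # e.g. "CC0", "CC-BY", "all-rights-reserved"
                    training_opt_in: bool  # explicit opt-in flag set by the artist

                def build_training_set(catalog: list[Artwork]) -> list[Artwork]:
                    """Keep only works whose artist opted in, or that are CC0 (public-domain-equivalent)."""
                    return [
                        work for work in catalog
                        if work.training_opt_in or work.license == "CC0"
                    ]

                catalog = [
                    Artwork("https://example.com/a.png", "artist_a", "CC0", False),
                    Artwork("https://example.com/b.png", "artist_b", "all-rights-reserved", False),
                    Artwork("https://example.com/c.png", "artist_c", "CC-BY", True),
                ]

                print([w.artist for w in build_training_set(catalog)])  # ['artist_a', 'artist_c']
                ```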

                • Even_Adder@lemmy.dbzer0.com · 11 months ago

                  > I actually did read it; that’s why I specifically called out MidJourney here, as they’re the one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generation parameters and you get a piece trained on their art. It’s essentially using an LLM to run an art-tracing scheme while skirting copyright law.

                  I’m pretty sure that’s all part of the discovery from the same case, where MidJourney is named as a defendant along with Stability AI; it isn’t its own distinct case. It’s also not illegal or improper to do what they’re doing. They aren’t skirting copyright law; it’s a feature explicitly allowed by it, so that you can communicate without fear of reprisal. Styles are not something protected by copyright, nor should they be.

                  > I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially, Creative Commons licensing for works used in datasets, to ensure that artists are duly compensated when their works are used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring they have the ability to protect their job as an artist from stuff like what MidJourney is doing.

                  You can’t extract compensation from someone doing their own independent analysis with the aim of making non-infringing novel works, and you don’t need licenses or permission to exercise your rights. Singling out AI here doesn’t make sense, because it isn’t a special system in that regard. That would be like saying the Dolphin developers have to pay Nintendo every time someone downloads their emulator.

                  • EldritchFeminity@lemmy.blahaj.zone · 11 months ago

                    You do realize that you basically just confirmed every fear that artists have over AI, right? That they have no rights or protections to prevent anybody from coming along and using their work to train an LLM to create imitation works more cheaply than they could possibly charge for their own, thereby putting them out of business? Because in the end, a professional in any field is nothing more than the sum of the knowledge and experience they’ve accrued over their career - a “style,” as you and MidJourney put it. And so long as somebody isn’t basically copy-pasting a piece, it’s not violating copyright, because it’s not potentially harming the market for the original piece, even if it is potentially harming the market for the creator of that piece.

                    The Dolphin analogy is also incorrect (though an interesting choice, considering Dolphin got pulled from the Steam store after the threat of legal action by Nintendo - but I think you and I feel the same way on that issue; Dolphin has done nothing wrong). A better analogy would be if Unreal created an RPGMaker-style tool that generated an entire game of any genre you want in Unreal Engine at the push of a button, by averaging a multitude of games across different genres to generate the script. If they didn’t get permission to use those games - either by paying a one-time fee, paying an ongoing fee, or using games that expressly give permission for that use - I’m sure the developers and publishers would be rather unhappy with Unreal. Could it be incredibly beneficial and vastly improve the process of creating games for the industry? Absolutely. If they released it for free, could it be used by anybody and everybody to make imitation Ubisoft games (or any other developer’s), and run the risk of strangling the industry with even more trash games with no soul in them? Also absolutely. And a big AAA publisher has a lot more ability to deal with knock-offs and competition like that than your average starving artist. The indie game scene is the strongest it’s ever been thanks to the rise of digital storefronts, but how many great indie game developers go under after producing their first game and never make a second? The vast majority, because indie games almost never make a profit, meaning they can’t afford to make another.

                    The issue with AI is that it opens a whole can of worms by creating an industrial-scale imitation generator that anybody can use at the push of a button. The general public has long made known its disdain for properly compensating artists for the work they do, and people are already gleefully doing corporations’ work for them by using AI to avoid having to hire artists. This runs the risk of creating a chilling effect in the field of creativity and the arts, as your average independent artist can no longer afford to keep doing art, thanks to the wonders of capitalism. There will always be people who do art as a hobby, but professional artists as we think of them today? Why go into debt training at an art school if all your job prospects have been replaced because people generate art for free with some form of LLM instead of hiring artists? I myself never went into art beyond a hobby level, despite wanting to, because of how abysmal the job prospects were even 15 years ago, and I simply can’t afford to do it as much as I’d like (if at all) between work, the time investment, and the expense of it. And that’s not even getting into the issues of LLM-generated porn of people, advertisements generated using the voices of dead (and still living) celebrities, scams made using the voices of relatives, and all the other ethical problems.

                    I used to work at a fish market with a kid who was a trained electrician, set to follow in the footsteps of his dad, who had been one of the highest-paid electricians in the US - except he gave up on it, because the thing he liked doing most in the field had been replaced with a machine by the time he graduated from technical school. Obviously the machine is more efficient (and probably safer), but instead of entering the field at all, he ended up working a job he hated, and to this day he has never found a job he has any passion for. What happens to art when the only professional artists left are NEETs, who have minimal living expenses, and those hired by corporations and the wealthy? Are we going to get the fine art market on steroids, where the masses only have access to AI-generated art that degrades in quality over time as the only new inputs are previous AI-generated pieces (unless there are enough hobby artists to provide sufficient new art), while the wealthy hold a monopoly on human-made art that the rest of us will probably never see?

                    This is all pure speculation, but it’s the Jurassic Park question: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”

      • Even_Adder@lemmy.dbzer0.com · 11 months ago

        It’s sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.

        What they want would score a huge inadvertent home run for corporations and swing the doors open for them to hinder competition, stifle undesirable speech, and monopolize spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there’s nothing good to be said about anyone trying to make them worse.

        Also, rest assured they’d collude with each other and only use their new powers to stamp out the little guy. It’ll be like American ISPs busting attempts at municipal internet all over again.