• IninewCrow@lemmy.ca · 29 days ago

    Nice … I look forward to the next generation of AI counter-countermeasures that will make the internet an even more unbearable mess, all to funnel as much money and control as possible to a small set of idiots who think they can become masters of the universe and own every single penny on the planet.

    • IndiBrony@lemmy.world · 29 days ago

      All the while we roast to death, because all of this will take more resources than the entire energy output of a medium-sized country.

      • ikt@aussie.zone · 29 days ago

        We’re rolling out renewables at something like 100x the rate of AI electricity use, so there’s no need to worry there.

        • Serinus@lemmy.world · 29 days ago

          Yeah, at this rate we’ll be just fine. (As long as this is still the Reagan administration.)

          • ikt@aussie.zone · 29 days ago

            Yep, the biggest worry isn’t AI, it’s India.

            https://www.worldometers.info/co2-emissions/india-co2-emissions/

            The West is lowering its CO2 output while India is slurping up all the CO2 we’re saving.

            This doesn’t include China, of course, the most egregious of the CO2 emitters.

            AI isn’t even a tiny blip on that radar, especially since AI runs in data centres and on devices powered by electricity, so the more your country shifts to renewables, the smaller its CO2 impact becomes over time.

            • Semjaza@lemmynsfw.com · 29 days ago

              Could you add the US to the graphs? The EU and the West are hardly synonymous, even as it descends into Trumpgardia.

      • Zozano@aussie.zone · 26 days ago (edited)

        I’ve been thinking about this for a while. Consider how quick LLMs are.

        If powering your device for the time a task would otherwise take uses more energy than a few LLM queries do, then the LLM is probably saving energy.
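
        A rough back-of-envelope version of that comparison, as a sketch; every figure below (laptop draw, time saved, per-query energy, query count) is an assumption, not a measurement:

        ```python
        # Back-of-envelope: energy for an hour of manual searching/debugging
        # vs. a handful of LLM queries. All numbers are assumed placeholders.
        LAPTOP_POWER_W = 50       # assumed average draw while browsing/searching
        HOURS_SAVED = 1.0         # assumed time saved per task by asking an LLM
        WH_PER_LLM_QUERY = 3      # assumed server-side energy per query
        QUERIES_PER_TASK = 5      # assumed prompts/re-prompts per task

        manual_wh = LAPTOP_POWER_W * HOURS_SAVED
        llm_wh = WH_PER_LLM_QUERY * QUERIES_PER_TASK

        print(f"Manual searching: {manual_wh:.0f} Wh")
        print(f"LLM queries:      {llm_wh:.0f} Wh")
        print("LLM comes out ahead" if llm_wh < manual_wh else "LLM costs more")
        ```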

        In all honesty, I’ve probably saved 50 hours or more since I started using it about two months ago.

        Coding has become incredibly efficient, and I’m not suffering through search-engine hell any more.

        Edit:

        Lemmy when someone uses AI to get a cheap, fast answer: “Noooo, it’s killing the planet!”

        Lemmy when someone uses a nuclear reactor to run Doom: Dark Ages on a $20,000 RGB space heater: “Based”

        • xthexder@l.sw0.com · 29 days ago (edited)

          Just writing code uses almost no energy. Your PC should be clocking down when you’re not doing anything. 1 GHz is plenty for text editing.

          Does ChatGPT (or whatever LLM you use) reduce the number of times you hit build? Because that’s where all the electricity goes.
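
          If you want to check the clocking-down part, here’s a quick sketch using the psutil library (assuming it’s installed and your platform exposes frequency info); run it while idle and again during a build:

          ```python
          # Sample CPU frequency and utilisation once a second; the clock should
          # drop when idle and ramp up during a build (pip install psutil).
          import psutil

          for _ in range(10):
              load = psutil.cpu_percent(interval=1)  # average utilisation over 1 s
              freq = psutil.cpu_freq()               # namedtuple: current/min/max MHz
              print(f"{freq.current:7.0f} MHz  {load:5.1f}% load")
          ```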

          • Zozano@aussie.zone · 28 days ago (edited)

            Except that half the time I don’t know what the fuck I’m doing. It’s normal for me to spend hours trying to figure out why a small config file isn’t working.

            That’s not just text editing; that’s browsing the internet, referring to YouTube videos, or wallowing in self-pity.

            That was before I started using GPT.

            • xthexder@l.sw0.com · 29 days ago (edited)

              It sounds like it does save you a lot of time, then. I haven’t had the same experience, but I did all my learning to program before LLMs.

              Personally I think the amount of power saved here is negligible, but it would actually be an interesting study to see just how much it is. It may or may not offset the power usage of the LLM, depending on how many questions you end up asking and such.

              • Zozano@aussie.zone · 29 days ago

                It doesn’t always get the answers right, and I have to re-feed its broken instructions back into itself to get the right scripts (roughly the loop sketched at the end of this comment), but for someone with no formal coding training, this saves me so much damn time.

                Consider that I’m juggling learning Linux (which I started four years ago), along with Python, Rust, NixOS, Bash scripts, YAML configs, etc.

                It’s a LOT.

                For what it’s worth, I don’t just take the scripts and paste them in; I’m always trying to understand what the code does, so I can be less reliant as time goes on.
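
                The re-feeding loop I mean is basically: run what it gave me, and if it breaks, paste the error straight back in. A minimal sketch of that idea; ask_llm() is a hypothetical stand-in for whatever chat interface or API you actually use:

                ```python
                # Sketch of the "paste the error back into the LLM" loop.
                # ask_llm() is a hypothetical stand-in, not a real API.
                import subprocess

                def ask_llm(prompt: str) -> str:
                    raise NotImplementedError("replace with your LLM of choice")

                def iterate_script(task: str, max_rounds: int = 3) -> str:
                    prompt = f"Write a bash script that does: {task}"
                    script = ""
                    for _ in range(max_rounds):
                        script = ask_llm(prompt)
                        result = subprocess.run(["bash", "-c", script],
                                                capture_output=True, text=True)
                        if result.returncode == 0:
                            return script  # it worked, keep this version
                        # feed the failure back in and try again
                        prompt = (f"This script failed with:\n{result.stderr}\n"
                                  f"Here is the script:\n{script}\nPlease fix it.")
                    return script
                ```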

          • Aux@feddit.uk · 29 days ago

            What kind of code are you writing that your CPU goes to sleep? If you follow any good practices like TDD, atomic commits, etc., and your code base is larger than hello world, your PC will be running at its peak quite a lot.

            Example: linting on every commit plus TDD. You’ll be making loads of commits every day, and linting a decent code base will push your CPU to 100% for a few seconds each time. Running tests, even with caches, will push the CPU to 100% for a few minutes. Then there’s compilation to run the app; some apps take hours to compile.
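
            If you want numbers for your own machine, here’s a small sketch that times each step; ruff and pytest are just assumed examples, substitute whatever tools your project actually uses:

            ```python
            # Time a lint pass and a test run to see how long the CPU sits pegged
            # per commit. ruff/pytest are assumed examples; swap in your own tools.
            import subprocess
            import time

            def timed(label: str, cmd: list[str]) -> None:
                start = time.perf_counter()
                subprocess.run(cmd, check=False)
                print(f"{label}: {time.perf_counter() - start:.1f}s")

            timed("lint", ["ruff", "check", "."])
            timed("tests", ["pytest", "-q"])
            ```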

            In general, text editing is a small part of the developer workflow. Only junior devs spend a lot of time typing stuff.

            • xthexder@l.sw0.com · 28 days ago

              Anything that’s per-commit is part of the “build” in my opinion.

              But if you’re running a language server and have stuff like format-on-save enabled, it’s going to use a lot more power as you’re coding.

              But like you said, text editing is a small part of the workflow, and looking up docs and browsing code should barely require any CPU; a phone can do it on fractions of a watt, and a PC should be underclocking when the CPU is underused.

              • Aux@feddit.uk · 28 days ago

                What do you mean “build”? It’s part of the development process.

      • barsoap@lemm.ee · 29 days ago

        Already there. The blackwall is AI-powered, and Markov chains are most definitely an AI technique.
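
        For the curious, the Markov-chain trick is tiny; a minimal word-level sketch of the technique (the training text is an arbitrary placeholder, and this is a generic illustration, not the linked tool’s actual code):

        ```python
        # Minimal word-level Markov chain text generator: the basic technique
        # for spewing plausible-looking junk at crawlers. Corpus is a placeholder.
        import random
        from collections import defaultdict

        corpus = ("the quick brown fox jumps over the lazy dog "
                  "and the quick dog jumps over the lazy fox").split()

        # Bigram transition table: word -> list of observed next words.
        chain = defaultdict(list)
        for current, nxt in zip(corpus, corpus[1:]):
            chain[current].append(nxt)

        def generate(start: str = "the", length: int = 20) -> str:
            word, out = start, [start]
            for _ in range(length):
                if word not in chain:
                    break
                word = random.choice(chain[word])
                out.append(word)
            return " ".join(out)

        print(generate())
        ```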