cross-posted from: https://lemmy.ml/post/20289663

A report from Morgan Stanley suggests the data center industry is on track to emit 2.5 billion tons of CO2 by 2030, three times the level predicted before generative AI came into play.

The extra demand from GenAI will reportedly push annual emissions from 200 million tons this year to 600 million tons by 2030, driven largely by the construction of more data centers to keep up with demand for cloud services.
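As a rough sanity check on the report's numbers, a crude back-of-envelope sketch (assuming, purely for illustration, a linear ramp in annual emissions from 200 million tons in 2024 to 600 million tons in 2030) lands in the same ballpark as the reported 2.5 billion-ton cumulative figure:

```python
# Back-of-envelope check of the Morgan Stanley figures.
# Assumption (not from the report): annual emissions grow linearly
# from 200 million tons in 2024 to 600 million tons in 2030.
start_mt = 200   # million tons this year
end_mt = 600     # million tons in 2030
years = list(range(2024, 2031))  # 7 years, inclusive

# Linear interpolation of annual emissions across the period.
step = (end_mt - start_mt) / (len(years) - 1)
annual = [start_mt + step * i for i in range(len(years))]

cumulative_mt = sum(annual)
print(f"cumulative: {cumulative_mt / 1000:.1f} billion tons")  # → 2.8 billion tons
```

A straight-line ramp slightly overshoots the 2.5 billion tons in the report, but it shows the orders of magnitude are consistent.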

  • @LarmyOfLone@lemm.ee
    -1 · 2 months ago

    If I run some AI model on my GPU and power my computer via solar power and some batteries, am I actually contributing significantly to GHG emissions?

    Like what is the embodied energy of an AI model?

    As usual, pundits and scientists confuse what is, and what could be, with the truth. For example, plastic recycling "isn't possible" because right now the economics don't make it profitable. Meaning capitalism is killing us, not plastics. I suspect the same is true for AI.

    • @Rin@lemm.ee
      6 · 2 months ago

      The model has to be trained, refined, etc. You running it off grid isn’t the entire process, but I agree with you in a different sense.

      If not AI, then there would be some other kind of compute taking up server capacity. It’s on the data centers to solve this one, not AI.

      • @0laura@lemmy.dbzer0.com
        1 · 2 months ago

        yeah, but the models are already trained and no one pays to use the open source ones, so you're not really contributing to the training's greenhouse gas emissions if you use an open source gen AI model locally.

        • @Rin@lemm.ee
          2 · 2 months ago

          Training a model isn’t free. It takes money and compute. That’s also the greenhouse emissions. Even if you don’t pay for any model and run it locally using solar, you’ve still got to consider what came before.