This conversation and the reactions it caused made me think of a few tips to explicitly veer away from AI-aided dystopias in your fictional universe.

Avoid a monolithic centralized statist super-AI

I guess ChatGPT is the mental model people default to: the idea that a single supercomputer manages every aspect of a community. And people are understandably wary of a single point of control that could too easily lead to totalitarianism.

Instead, have a multitude of transparent local agents managing different systems, each with its own algorithm and “personality”.

Talk about open source

The most widely used AI models today are open source. Our media is biased towards thinking that things which do not generate commercial transactions are unimportant, yet I am willing to bet that more tokens are generated by all the free models in the world than by OpenAI and its commercial competitors.

AIs are not to be produced by opaque companies in their ivory towers. They are the result of researchers and engineers who have a passion for designing smart systems and who (a fact too often obscured by the sad state of a society where you often have to join a company to make a living) do it with a genuine concern for humanity’s well-being and a desire that this work be used for the greater good.

It is among AI engineers that you will find the people most paranoid about AI safety and safeguards. In a solarpunk future, this is a public debate and an important political subject in the policy discussion: we make models together, with incentives that are collectively agreed upon.

AIs are personal

You don’t need a supercomputer to run an AI. LLMs today run on relatively modest gaming hardware, even on a Raspberry Pi (though slowly, for now). Energy-efficient chips are currently being designed to lower the barrier to entry even further.
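As an illustration, here is a minimal sketch of local inference using the llama-cpp-python bindings. The model file name is a placeholder: point it at whatever small quantized GGUF model you have downloaded (on a Raspberry Pi-class machine you would pick one of the tiniest ones).

```python
# Minimal local LLM inference sketch (pip install llama-cpp-python).
# The model path is a placeholder for any small quantized GGUF model.
from llama_cpp import Llama

llm = Llama(model_path="models/small-model.Q4_K_M.gguf", n_ctx=2048)

result = llm(
    "Q: Why might a community prefer locally run AI agents?\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```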

It is a very safe bet that in the future, every person will have their own intelligent agent managing their local devices, or even one agent per device with an orchestrator on their smartphone. And it is important that people stay in complete control of these agents.
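To make that architecture concrete, here is a toy sketch of the “one agent per device plus an orchestrator” idea; every name in it is made up for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class DeviceAgent:
    """A small agent bound to one device, fully controlled by its owner."""
    name: str
    handle: Callable[[str], str]  # local policy: request in, report out

class Orchestrator:
    """Runs on the owner's phone and routes requests to per-device agents."""
    def __init__(self) -> None:
        self.agents: Dict[str, DeviceAgent] = {}

    def register(self, agent: DeviceAgent) -> None:
        self.agents[agent.name] = agent

    def ask(self, device: str, request: str) -> str:
        # The owner stays in charge: nothing happens without an explicit request.
        return self.agents[device].handle(request)

orchestrator = Orchestrator()
orchestrator.register(DeviceAgent(
    "dishwasher", lambda r: f"dishwasher: scheduled '{r}' for the next cheap-energy window"))
orchestrator.register(DeviceAgent(
    "thermostat", lambda r: f"thermostat: applying '{r}' within owner-set limits"))
print(orchestrator.ask("dishwasher", "run the eco cycle tonight"))
```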

AIs should enhance humans’ control over their own devices, not make them surrender it.

AIs as enablers of democracy

You not only use your pocket AI to control your dishwasher; it is also your personal lawyer and representative. No human has the bandwidth to follow all the policy debates happening in a typical country, or even a local community. But a well-designed agent that spends time discussing things with you will know your preferences and make sure they are represented.

It can engage in discussions with other agents to find compromises, and to propose or oppose initiatives.

With everyone’s opinion now included in every decision about road planning, public transportation, construction schedules and urban development, the general landscape will organically grow friendlier for everybody.
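Here is a deliberately simplified sketch of such delegate agents. The preference weights and the compromise rule are illustrative assumptions, not a proposed protocol.

```python
from dataclasses import dataclass

@dataclass
class PolicyProposal:
    title: str
    tags: set[str]  # topics the proposal touches, e.g. {"cycling", "parking"}

@dataclass
class DelegateAgent:
    """A personal agent that has learned its owner's preferences and weighs proposals on their behalf."""
    owner: str
    preferences: dict[str, float]  # topic -> how much the owner cares (-1..1)

    def stance(self, proposal: PolicyProposal) -> float:
        # Aggregate the owner's learned preferences over the proposal's topics.
        return sum(self.preferences.get(tag, 0.0) for tag in proposal.tags)

def negotiate(agents: list[DelegateAgent], proposal: PolicyProposal) -> str:
    """Toy compromise rule: adopt a proposal only if the summed stances are positive."""
    total = sum(agent.stance(proposal) for agent in agents)
    return "adopt" if total > 0 else "send back for amendment"

alice = DelegateAgent("alice", {"cycling": 0.9, "night-noise": -0.4})
bob = DelegateAgent("bob", {"cycling": 0.2, "parking": 0.5})
bike_lane = PolicyProposal("Protected bike lane on Main Street", {"cycling", "parking"})
print(negotiate([alice, bob], bike_lane))  # -> adopt
```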

  • 370W is quite a bit of electricity, and that could scale up as people run more computationally heavy AI or want shorter runtimes. 500-1000W PSUs are common. For reference, a Raspberry Pi 4 consumes 2.5-4W under a normal load. The average American home consumes 30-50kWh per day, which works out to an average draw of roughly 1.3-2.1kW. So a constant 370W would account for roughly 18% to 30% of an average American household’s electricity consumption.

    We need to be careful with eco-modernist perspectives on AI, since we are already seeing increased demand for power, which is one of the main growth-related issues impacting climate change. AI certainly has a place, especially in medical research. However, there are also many talented human artists who are looking for work. Meanwhile, image generation drives work efficiency, which is far from essential.

    Critically appraising the accelerationism around AI by no means makes one a luddite. I say this as someone deeply entrenched in tech.

    • max@lemmy.blahaj.zone · 11 months ago
      Aren’t Raspberry Pi 4s more like 10-12W when you’re using one as a desktop? Or have I got that confused with the Pi 5s, idk >~< I’ve still got a 3B+ and a B model lmao, and I stick by reduce/reuse before buying new. I do agree though that 370W is quite a lot for a computer; my main laptop is 150W and it is incredibly snappy + has a dGPU (Nvidia GTX 1060M), so I’d honestly consider that the maximum power consumption a computer needs, I don’t really see why you’d want more. I have an old Lenovo not-quite-ThinkPad that was like $150 and came out just before soldered RAM + eMMC became the norm, so I’ve managed to upgrade it to 8GB of RAM (it came with 2 [WITH WIN 10 !! ??]). It’s a dual-core 2.1GHz machine and a bit slow for my tastes, but it can do word processing and some web browsing, and it has a usual power consumption of 7W on battery and about 20W when plugged in + charging, so I’m very happy with that old thing :3

    • keepthepace@slrpnk.netOP · 11 months ago

      370W is the peak consumption of my computer, which is oversized if you only want to run inference; I also want to run training. If LLMs become a commodity, they will likely run on specialized hardware that is unlikely to draw more than 20W while active, and which will likely be active less than 1% of the time. I used my 370W figure to state something I am sure of, and to show that even without any optimization effort this is, at worst, a very manageable amount.
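      A quick back-of-the-envelope check of the figures in this thread (the inputs are the rough numbers quoted above, not measurements):

      ```python
      # Rough energy arithmetic for the scenarios discussed in this thread.
      peak_pc_w = 370             # peak draw of my oversized training/inference desktop
      household_kwh_per_day = 40  # mid-point of the 30-50 kWh/day range cited above
      household_avg_w = household_kwh_per_day * 1000 / 24  # ~1.7 kW average draw

      # Worst case: the 370W machine runs flat out all day.
      print(f"Flat-out desktop: {peak_pc_w / household_avg_w:.0%} of average household draw")

      # Commodity scenario: ~20W specialized hardware, active about 1% of the time.
      commodity_avg_w = 20 * 0.01
      print(f"Commodity agent: {commodity_avg_w / household_avg_w:.2%} of average household draw")
      ```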