he/they

  • 86 Posts
  • 833 Comments
Joined 2 years ago
Cake day: February 2nd, 2024

  • If the AI bros get these things built before the bubble pops, we should expect bad designs that don't allow for local conditions, set reactor operators up for ridiculous errors, and lead to lots of nuclear accidents. Hopefully not very big ones. Cross fingers!

    If and when those accidents start happening, it's going to set back adoption of nuclear power by years, if not decades - especially if there's an incompetent response to those accidents (which, considering Starmer and Trump are in charge, is worryingly likely).

    One of the other important things about the nuclear regulation process is that it makes sure local people are involved. You can't skip that step either. And if you run roughshod over the locals for the sake of AI, remember there are already people in the streets protesting AI.

    The government will almost certainly try to just bypass the activists. But remember: anti-nuclear activists have decades of experience at this. So I’m sure it’ll go great all round.

    Anti-nuclear activists are going to have a field day with this, aren’t they?

  • I’m gonna say it: The entire “artificial intelligence”/“machine learning” research field is corrupt. They have institutionally accepted the bullshit fountain as a tool. It doesn’t matter if they’re only using chatbots as a “pilot program”; they’ve bought into the ideology. They’ve granted fashtech a seat at the bar and forced all the other customers to shake its hand.

    Abso-fucking-lutely. Oxford's latest "research paper" isn't marketing - it's propaganda. Propaganda for bullshit fountains, and for the ideology which endorses them.