• sexy_peach@feddit.org · 33 points · 10 days ago

    Someone said that calling their misinfo “hallucinations” is actually genius, because everything they say is a hallucination, even the things we read and think are correct. The whole thing hallucinates away, and then we come along and say: OK, some of this makes a lot of sense, but the rest…

    So basically that’s why it will never be 100% correct: all of the output is just hallucination that happens to be more or less correct.

    • ✺roguetrick✺@lemmy.world · 15 points · 10 days ago

      Human pattern recognition making the insane machine seem like it’s making sense. Astrology but with venture capital backing. I like it.

    • Pelicanen@sopuli.xyz · 12 points · 10 days ago

      “So basically that’s why it will never be 100% correct all the time, because all of the output is just more or less correct hallucination.”

      This is completely correct: it does exactly the same thing when it works the way people expect as it does when it’s “hallucinating”.
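      A toy sketch of that point (purely illustrative, not any real model): generation is just repeated sampling from a learned next-token distribution, and nothing in that loop checks whether the resulting sentence is true, so factual and “hallucinated” outputs come from the identical procedure.

```python
import random

# Toy stand-in for a language model: a hand-built next-token table.
# (Hypothetical probabilities; real LLMs learn these from data.)
NEXT = {
    "<s>":     [("the", 1.0)],
    "the":     [("capital", 0.5), ("moon", 0.5)],
    "capital": [("of", 1.0)],
    "of":      [("france", 0.5), ("cheese", 0.5)],
    "france":  [("is", 1.0)],
    "cheese":  [("is", 1.0)],
    "is":      [("paris", 0.5), ("lyon", 0.5)],
    "moon":    [("is", 1.0)],
    "paris":   [("</s>", 1.0)],
    "lyon":    [("</s>", 1.0)],
}

def generate(rng):
    """One sampling loop: pick each next token by probability until </s>.
    Nothing here has any notion of truth."""
    tokens, cur = [], "<s>"
    while cur != "</s>":
        choices, weights = zip(*NEXT[cur])
        cur = rng.choices(choices, weights=weights)[0]
        if cur != "</s>":
            tokens.append(cur)
    return " ".join(tokens)

rng = random.Random(0)
for _ in range(3):
    print(generate(rng))
# Some samples happen to be factual ("the capital of france is paris"),
# others are nonsense, but all came out of the exact same loop.
```

      Whether a given sample reads as knowledge or as a hallucination is decided by the reader, not by anything the sampler does differently.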

    • technocrit@lemmy.dbzer0.com · 4 points · 9 days ago

      The problem with “hallucinations” is that computers don’t hallucinate. It’s just more anthropomorphic grifter hype. So, while it sounds like a criticism of “AI”, it’s just reinforcing false narratives.