• lunarul@lemmy.world
    6 months ago

    LLMs reproduce the form of language without any meaning being transmitted. That’s called parroting.

    Even if (and that’s a big if) AGI is ever achieved, there will be people calling it parroting by that definition. That’s the Chinese room argument.

      • lunarul@lemmy.world
        5 months ago

        Me? How can I move goalposts in a single sentence? We’ve had no previous conversation… And I’m not agreeing with the previous poster either…

        • Prunebutt@slrpnk.net
          5 months ago

          By entering the discussion, you also engaged in the previous context. The discussion was about LLMs being parrots.

          • lunarul@lemmy.world
            5 months ago

            And the argument was whether there’s meaning behind what they generate. That argument applies to AGIs too. It’s a deeply debated philosophical question: what is meaning? Is our own thought pattern deterministic, and if it is, how do we know there’s any meaning behind our own actions?

            • Prunebutt@slrpnk.net
              5 months ago

              The burden of proof lies on the people making the claims about intelligence. “AI” pundits have supplied nothing but marketing hype.