• MiddleAgesModem@lemmy.world · 2 days ago

    They think the LLM hallucination problem will be ironed out in a couple of years.

    That one is a tad more realistic than uploading human consciousness.

    • WraithGear@lemmy.world · 2 days ago

      Not unless they pivot on the basic principles of LLMs, instead of trying to force a square peg into a round hole.

      • MiddleAgesModem@lemmy.world · 1 day ago (edited)

        Hallucinations have already been reduced. You’re expressing a pretty standard anti-LLM stance, but people in the field seem to think the hallucination problem can be fixed, even with something as simple as having models say “I don’t know”, along with better use of tools and sources.

        The fact that hallucination is less of a problem than it used to be should make it pretty clear that it’s not immutable.

        In any event, it’s still vastly more plausible than artificially transferring human consciousness.