• TootSweet@lemmy.world
    1 month ago

    These types of errors happen even after including prompts like “Do not hallucinate.”

    Genius! Why didn’t I think of that!

    • Architeuthis@awful.systems
      1 month ago

      In every RAG guide I’ve seen, the suggested system prompts have always included some more dignified variation of “Please, for the love of god, only and exclusively use the contents of the retrieved text to answer the user’s question, I am literally on my knees begging you.”
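
      For anyone who hasn’t read one of those guides, the pattern is roughly the sketch below. The template wording and the function name are my own illustration of the generic approach, not taken from any particular guide or library:

      ```python
      # Minimal sketch of the usual RAG prompt-assembly pattern:
      # stuff the retrieved chunks into the prompt and plead with the
      # model to answer only from them.

      def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
          """Assemble a prompt that begs the model to stick to the retrieved text."""
          context = "\n\n".join(
              f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
          )
          return (
              "Answer ONLY from the context below. "
              "If the answer is not in the context, say you don't know. "
              "Do not use outside knowledge.\n\n"
              f"Context:\n{context}\n\n"
              f"Question: {question}\nAnswer:"
          )

      # Example usage:
      print(build_rag_prompt(
          "What is the return policy?",
          ["Returns are accepted within 30 days with a receipt."],
      ))
      ```

      Whether the model actually obeys that instruction is, of course, the entire point of contention.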

      Also, if Reddit is any indication, a lot of people actually think that’s all it takes and that the hallucination stuff is just people using LLMs wrong. I mean, it would be insane to pour so much money into something so obviously fundamentally flawed, right?