• SmokeyDope@lemmy.world · 3 hours ago

    You can say the exact same thing about most humans you might want to ask advice from. Being biased and capable of lies of omission would easily disqualify 90% of our species, if not more.

    I would agree with taking LLMs' advice with at most a grain of salt, but for the reason that they have no actual human life experience or true sentience. A stochastic parrot can boilerplate a good suggestion, but it's up to you to take responsibility for accountability and scrutinize it like all advice, as a properly aware and thinking being.

    • quacky@lemmy.worldOP · 24 minutes ago

      It would help if the LLMs didn't spit out suggestions, recommendations, tips, and next steps, as even the mere mention of them can influence you. All of their advice is biased toward whatever the weirdos who own them want it to be, and they work with local police and the Trump administration.

  • NutinButNet@hilariouschaos.com · 5 hours ago

    The ones I use do provide citations, though I sometimes find dead links, and I have called that out so it can provide another link for the information it is sharing.

    No one should blindly trust any LLM, or any single source for that matter. There should be, at the very least, two unrelated sources saying the same thing, with the LLM not counting as one of the two, since it is just regurgitating information.

    But it really depends on the usage and how serious it is to you. For health stuff? Definitely seek multiple unrelated sources. Wanting to know some trivia about your favorite movie? Maybe check one of the sources if it's that important to you. And you should definitely get the source if you're writing a paper for school, etc.

    I use the LLM as a search engine, of sorts, and in some of the work I do I rely on the sources it shares more than on the information it provides. I sometimes find it easier than my usual methods, which don't always work anymore, especially for situations where I can only describe something with a lot of words.

    • feddylemmy@lemmy.world · 3 hours ago

      I use it as a search engine too. You can ask most LLMs to cite a claim. Then you can evaluate the claim based on the credibility of the source. They’re somewhat decent at summarizing and the corollary is that they’re somewhat decent at going through a lot of material to find something that might be what you’re looking for. It’s all about the verification of the source though.
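
      As a rough illustration of that verification step (this sketch is not from any commenter; the helper name, the HEAD-request approach, and the example URLs are all assumptions), one could at least confirm that the links an LLM cites still resolve before reading them, which also catches the dead-link problem mentioned earlier in the thread:

      ```python
      # Hypothetical helper: check that URLs an LLM cited actually resolve.
      # The example URLs below are placeholders, not real citations.
      import urllib.request
      import urllib.error

      def citation_status(url: str, timeout: float = 10.0) -> str:
          """Return 'ok', 'dead', or 'unreachable' for a cited URL."""
          # A HEAD request is enough to see whether the page still exists.
          request = urllib.request.Request(
              url, method="HEAD", headers={"User-Agent": "citation-check/0.1"}
          )
          try:
              with urllib.request.urlopen(request, timeout=timeout) as response:
                  return "ok" if response.status < 400 else "dead"
          except urllib.error.HTTPError as error:
              # 404s and similar mean the citation is a dead link.
              return "dead" if error.code >= 400 else "ok"
          except (urllib.error.URLError, TimeoutError):
              return "unreachable"

      if __name__ == "__main__":
          cited = [
              "https://example.com/",
              "https://example.com/page-that-does-not-exist",
          ]
          for url in cited:
              print(url, "->", citation_status(url))
      ```

      Whether a link that does resolve is actually credible is still the reader's call, as the comment says.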

    • quacky@lemmy.worldOP · 4 hours ago

      The ones that provide sources, like Perplexity or Gemini, seem to include blogs and Reddit posts. They can't differentiate a good source from a bad one.

      Also, this relates back to the bias problem: even if they are citing credible sources, their interpretation is unreliable as well.
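
      That kind of differentiation can at least be done crudely on the reader's side. A minimal sketch, assuming you maintain your own list of low-confidence domains (the list and URLs below are examples, not anything from the thread), and noting that it does nothing about the interpretation problem:

      ```python
      # Rough sketch: flag cited URLs whose domains are, say, blogs or Reddit
      # threads as low-confidence, since the model itself does not rank them.
      from urllib.parse import urlparse

      # Example list only; what counts as low-confidence is your own judgment.
      LOW_CONFIDENCE_DOMAINS = {"reddit.com", "blogspot.com", "medium.com"}

      def source_confidence(url: str) -> str:
          """Label a cited URL 'low' or 'unrated' based on its domain."""
          host = (urlparse(url).hostname or "").lower()
          for domain in LOW_CONFIDENCE_DOMAINS:
              if host == domain or host.endswith("." + domain):
                  return "low"
          return "unrated"

      if __name__ == "__main__":
          for url in ["https://old.reddit.com/r/somepost", "https://www.example.org/article"]:
              print(url, "->", source_confidence(url))
      ```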

  • iii@mander.xyz · 6 hours ago

    If those are the criteria, then humans should not give advice either.