The Apple M5 Pro chip is rumored to be announced later this week with improved GPU cores and faster on-device inference. With consumer hardware getting better and better, and AI labs squishing models down to fit in tiny amounts of VRAM, it's becoming increasingly feasible to have an assistant that has absorbed the entirety of the internet's knowledge built right into your PC or laptop, all running privately and securely offline. The future is exciting, everyone; we are closing the gap.

  • kata1yst@sh.itjust.works
    2 days ago

    Totally agreed. LLMs shouldn't be asked to know things; it's counterproductive. They should be asked to DO things, and use available tools to do that.
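
    For what it's worth, here's a minimal sketch of that pattern, assuming a harness that prompts a local model to emit a JSON tool call which your own code then executes (the tool names and the JSON shape are made up for illustration, not any particular runtime's API):

    ```python
    import json
    from datetime import datetime

    # Toy tool registry: the model only picks a tool and arguments;
    # it never has to "know" the answer itself.
    def get_current_time(_args):
        return datetime.now().isoformat()

    def search_files(args):
        # Stand-in: a real harness would hit the local filesystem or an index.
        return f"results for {args.get('query', '')!r}"

    TOOLS = {
        "get_current_time": get_current_time,
        "search_files": search_files,
    }

    def run_tool_call(model_output: str) -> str:
        # Parse a JSON tool call emitted by the model and run it locally, e.g.
        # {"tool": "search_files", "arguments": {"query": "tax deadline"}}
        call = json.loads(model_output)
        fn = TOOLS.get(call["tool"])
        if fn is None:
            return f"error: unknown tool {call['tool']!r}"
        return fn(call.get("arguments", {}))

    # Pretend the local model emitted this instead of answering from memory:
    print(run_tool_call('{"tool": "search_files", "arguments": {"query": "tax deadline"}}'))
    ```

    The point is that the model's job shrinks to choosing the right tool and arguments; the facts come from the tools, not from whatever it memorized.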