If you just want to run LLMs quickly on your computer from the command line, this is about as simple as it gets. Ollama provides a straightforward CLI for generating text, and there’s also a Raycast extension for more powerful usage.
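A minimal session might look like this (a sketch assuming Ollama is already installed; `llama3` is just an example model name, substitute any model from the Ollama library):

```shell
# Download a model to your machine (example model name)
ollama pull llama3

# One-shot generation straight from the command line
ollama run llama3 "Explain what a symlink is in one sentence."

# Or drop into an interactive chat session
ollama run llama3

# See which models you have locally
ollama list
```

Everything runs against the local model once it’s pulled, so no API keys are involved.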

  • silas@programming.dev (OP)
    1 year ago

    Can’t speak to that much because I haven’t reviewed the code myself, but it’s open source and everything runs locally on your machine without network requests.