• ikt@aussie.zone
    1 month ago

    GLM is my local LLM of choice; I use Mistral Le Chat when not using it

  • FrankLaskey@lemmy.ml
    1 month ago

    Yes. GLM 4.5 is excellent. I mostly use that along with Qwen3 and DeepSeek 3.1 Terminus (Thinking) these days.

  • Fitik@fedia.io
    1 month ago

    Yep, I mainly use GLM-4.6 now and it’s actually pretty good. I use it through the API, not locally though

  • Pika@rekabu.ru
    19 days ago

    Tried just now (not the local version), and it seems to be superior indeed.

    Asked it about a relatively obscure scientific topic and it answered correctly and in detail. Most models fail miserably. Nice.