GLM is my local LLM of choice, i use Mistral Le Chat when not using it
Yes. GLM 4.5 is excellent. I mostly use that along with Qwen3 and DeepSeek 3.1 Terminus (Thinking) these days.
Yep, I mainly use GLM-4.6 now and it’s actually pretty good, I use it through API, not locally tho
Only used DeepSeek of these
Tried just now (not the local version), and it seems to be superior indeed.
Asked it on a relatively obscure scientific topic and it answered correctly and in detail. Most models fail miserably. Nice.