Where can I buy this?
Edit: I realized after I commented that this was the product page… My bad. It was more of a take-my-money-now scenario.
This is literally a product page to buy them
I wonder if the driver is compatible with Linux.
Why wouldn’t it be? (Like, I’m thinking: why would they support Microsoft, when the only other viable option is FreeBSD?)
The world still uses Windows heavily, so adoption by end consumers relies on it.
Try the link of the post you’re responding to.
I kinda want an individual consumer-friendly, low- to mid-range alternative that can run my games and video editing software for very small projects… so far I’m only eyeing the Lisuan G100, which seems to fit that bill…
This seems cool though; besides AI, it could be used for distributed cloud computing or something of that sort.
For inference only. NVIDIA GPUs are so dominant because they can train models, not just run them. All other GPUs seem to lack that capacity.
You can train or fine-tune a model on any GPU. Sure, it will be slower, and more VRAM is better.
No. The CUDA training stuff is Nvidia only.
PyTorch runs on HIP (ROCm) now.
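For anyone who wants to check their own install: ROCm builds of PyTorch report a HIP version via `torch.version.hip` (it's `None` on CUDA/CPU-only wheels), and `torch.cuda.is_available()` also returns True on ROCm builds. A minimal sketch, assuming only a Python install (it degrades gracefully if torch isn't present):

```python
def rocm_status():
    """Report whether the installed PyTorch build targets ROCm/HIP."""
    try:
        import torch
    except ImportError:
        return "torch not installed"
    # ROCm wheels set torch.version.hip to a version string;
    # CUDA and CPU-only wheels leave it as None.
    hip = getattr(torch.version, "hip", None)
    if hip:
        return f"ROCm/HIP build {hip} (GPU visible: {torch.cuda.is_available()})"
    return "non-ROCm build (CUDA or CPU-only)"

print(rocm_status())
```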
AMD has been lying about that every year since 2019.
Last time I checked it didn’t. And it probably still doesn’t.
People wouldn’t be buying NVIDIA if AMD worked too. The VRAM prices NVIDIA asks are outrageous.
I run llama.cpp and PyTorch on MI300s. It works really well.
Can you train on it too? I tried PyTorch on AMD once and it was awful. They promised mountains but delivered nothing. Newer activation functions were all broken.
llama.cpp is inference-only, and AMD works great for that too after converting the model to GGUF. But training was awful on AMD in the past.
We have trained transformers and diffusion models on AMD MI300s, yes.
These only work with ARM CPUs, I think.
This product is no longer available.
and of course a Chinese company would never re-badge something and slap their own name on it
that’s some quality cope there
you seem to be projecting real fucking hard mister alibaba. good luck with your new Huawei GPU
aww will you look at that, little wasp is mad 🤣
Try harder, funny man. Using yourself as a source for yourself, that is fucking funny.
I don’t need to try harder, you’re raging as it is. Don’t want you to have an aneurysm.
Like I said before, you’re the only source saying I’m raging, so yes, you are trying hard, just not hard enough, or you’d find something better to make up. Maybe you’re talking about cope because you’re just talking about yourself.
I love how you can’t help yourself but keep replying here. Keep on seething there little buddy, it’s adorable.
PCIe 3.0, DDR4 memory, no drivers, no fans. You’d be better off with any DDR4 CPU and a bunch of RAM.
When you definitely know the difference between what a CPU and a GPU does.
For $2000 it “claims” to do 140 TOPS of INT8.
Meanwhile, an Intel Core Ultra 7 265K does 33 TOPS of INT8 for $284. Don’t get me wrong, I would LOVE to buy a Chinese GPU at a reasonable price, but this isn’t even price-competitive with CPUs, let alone GPUs.
Again, completely different purposes here.
Alright, let’s compare it to another GPU.
According to this source, the RTX 4070 costs about $500 and does 466 TOPS of INT8.
I don’t know if TOPS is a good metric though (I don’t have any experience with AI benchmarking).
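FWIW, the price-efficiency gap is easy to put in one place. A quick back-of-envelope script using the TOPS and price figures quoted in this thread (all claimed, none verified):

```python
# INT8 TOPS and USD prices as quoted in this thread (claimed, not verified specs).
devices = {
    "this card (claimed)": (140, 2000),
    "Core Ultra 7 265K":   (33, 284),
    "RTX 4070":            (466, 500),
}

for name, (tops, price) in devices.items():
    # Higher is better: how many INT8 TOPS you get per dollar spent.
    print(f"{name}: {tops / price:.3f} INT8 TOPS per dollar")
```

By these numbers the card delivers roughly 0.07 TOPS/$, the CPU about 0.12, and the RTX 4070 about 0.93, with the usual caveat that raw TOPS says nothing about VRAM or memory bandwidth.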
Now go look at the amount of VRAM it has.