this post was submitted on 25 Aug 2023
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
I don't know about you, but my M1 Pro is a hell of a lot faster than my 5800X in llama.cpp.
These CPUs benchmark similarly across a wide range of other tasks.
*No consumer Intel hardware is on that list.
The only widely available consumer hardware with AVX-512 support is AMD's Zen 4 (Ryzen 7000 series).
I think just about the only Apple computer that supports AVX-512 is the Intel Xeon-based 2019 Mac Pro.