I’ve been running some LLMs on my 5600X and 5700G CPUs, and the performance is… OK but not great. Token generation is about “reading out loud” pace for the 7B and 13B models. I also hit occasional system crashes that I haven’t diagnosed yet — possibly high RAM utilization, but possibly just power/thermal management issues.
A 50% speed boost would probably make the CPU option a lot more viable for a home chatbot, just due to how much easier it is to build a system with 128 GB of RAM than 128 GB of VRAM.
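Token generation on CPU is largely memory-bandwidth-bound: each generated token has to stream roughly the whole set of model weights through the memory bus once. A rough back-of-envelope sketch (the bandwidth and quantized model size numbers below are illustrative assumptions, not measurements):

```python
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on generation speed for memory-bound inference:
    each token requires streaming the full weight set once, so
    tokens/s <= effective bandwidth / model size."""
    return bandwidth_gb_s / model_size_gb

# Assumed figures: dual-channel DDR4-3200 is ~51.2 GB/s theoretical peak,
# and a 13B model at 4-bit quantization is on the order of ~7.4 GB on disk.
print(round(est_tokens_per_sec(51.2, 7.4), 1))  # ~6.9 tokens/s upper bound

# A hypothetical 50% bandwidth bump scales the ceiling proportionally:
print(round(est_tokens_per_sec(51.2 * 1.5, 7.4), 1))
```

Real-world numbers come in well under this ceiling (compute, cache effects, and sustained vs. peak bandwidth all eat into it), but it explains why faster RAM helps CPU inference almost linearly.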
I personally am going to experiment with the 48 GB modules in the not-too-distant future.
You could put an 8700G in the same socket. The CPU isn't much faster, but it has the new NPU for AI. I'm thinking about this upgrade from my 2400G but might want to wait for the new socket and DDR5.
I looked at upgrading my existing AMD-based system's RAM for this purpose, but found out my mobo/CPU only supports 128 GB of RAM. Lots, but not as much as I had hoped I could shove in there.