I just installed llama.cpp on CachyOS after reading this article. In my testing it's noticeably faster than Ollama, and I prefer it overall.