Hacker News
How to Deploy LLM Locally (lyc8503.net)
2 points by todsacerdoti 3 months ago | 1 comment


Ollama is very convenient, but I'd advise against bothering with it: the capabilities of local models are really poor.

