lxe on Feb 8, 2024 | on: OpenAI compatibility
Does ollama support loaders other than llama.cpp? I'm using oobabooga with ExLlamaV2 to run EXL2 quants on dual NVIDIA GPUs, and nothing else seems to beat its performance.
_ink_ on Feb 9, 2024
I tried that, but I failed to get the GPU split working. Do you have a link explaining how to do that?
lxe on Feb 9, 2024 | parent
Do what exactly? I have no issues with GPU split on oobabooga with either EXL2 or GGUF.
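For readers landing here: in oobabooga's text-generation-webui, multi-GPU splitting for the ExLlamaV2 loader is typically controlled with the `--gpu-split` flag, a comma-separated list of per-GPU VRAM budgets in GB. The sketch below assumes a dual-GPU machine; the model directory name and the 20,22 budget values are illustrative placeholders, not values from this thread:

```shell
# Sketch: launch text-generation-webui with an EXL2 quant split across two GPUs.
# --gpu-split takes per-device VRAM budgets in GB (here 20 GB on GPU 0, 22 GB on GPU 1).
# "MyModel-exl2" is a hypothetical model folder under ./models — substitute your own.
python server.py \
  --model MyModel-exl2 \
  --loader exllamav2 \
  --gpu-split 20,22
```

Leaving a little headroom below each card's physical VRAM (for activations and context cache) is usually necessary, so the budgets should be somewhat below the cards' nominal capacities.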