Scott@sh.itjust.works to Selfhosted@lemmy.world • Self-hosting LLMs • 10 days ago
I've got a home server with an Nvidia Tesla P4. It's not the most powerful card, and 8 GB of VRAM isn't much, but one can be had for around $100 USD (it's a headless GPU, so no video outputs).
I'm using Ollama with dolphin-mistral, and recently deepseek-coder.