Track_Shovel to Lemmy Shitpost@lemmy.world (English) · 2 months ago
Hexadecimal (image, slrpnk.net) · 109 comments · 1.05K upvotes, 0 downvotes
@vvilld@lemmy.world · 1 point · 2 months ago
I meant: is hosting it locally something someone without a coding background can do easily?
@Fillicia@sh.itjust.works · 2 points · 2 months ago
Without a coding background, yes. For someone technically illiterate it might be an issue. Ollama is a good starting point.
@rumba@lemmy.zip · 2 points · 2 months ago
Yeah, it's not bad. Install Ollama and Docker, then run Open WebUI in Docker and tell it to fetch DeepSeek. Instructions: https://archive.is/fOWXO. Alternatively, try pinokio.computer or jan.ai.
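The steps in that comment can be sketched roughly as follows. This is a sketch, not a definitive install guide: it assumes a Linux/macOS host with Docker already installed, uses Ollama's official install script, and uses the Docker invocation from Open WebUI's published quick-start (port 3000 on the host, the `open-webui` volume name, and the `ghcr.io/open-webui/open-webui:main` image tag may differ in current docs):

```shell
# Install Ollama via its official install script (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a DeepSeek R1 model locally; "deepseek-r1" is the name
# listed in the Ollama model library
ollama pull deepseek-r1

# Run Open WebUI in Docker, letting the container reach the
# host's Ollama instance via host.docker.internal
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Then open http://localhost:3000 and pick deepseek-r1 as the model
```

After that, the chat UI in the browser talks to the local model; nothing leaves the machine.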
@Raptorox@sh.itjust.works · 1 point · 1 month ago
A really simple way is to use LM Studio: just install it and select deepseek-r1 (the default is the 7B model, IIRC).