Meta AI (image: image.nostr.build)
n7gifmdn@lemmy.ca to Memes@lemmy.ml · English · 10 months ago · 23 comments
m-p{3}@lemmy.ca · 10 months ago (edited)
The quantized model you can run locally works decently, and they can't read any of it, which is nice. I use this one specifically: https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/blob/main/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf
If you're looking for relatively user-friendly software to run it, you can look at GPT4All (open source) or LM Studio.
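For anyone wondering what the "Q4" in that filename means: the weights are stored as 4-bit integers plus a scale per block instead of full 32-bit floats, which is what makes an 8B model fit in ordinary RAM. A toy sketch of the idea (not the actual GGUF Q4_K_M format, which uses a more elaborate block layout):

```python
# Toy illustration of 4-bit quantization: map each float to a small
# integer in [0, 15] with a shared scale and offset, then reconstruct.
# This is the concept only, NOT the real Q4_K_M scheme used by GGUF.

def quantize(weights, bits=4):
    """Map floats to ints in [0, 2**bits - 1] with a scale and offset."""
    levels = 2 ** bits - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / levels or 1.0   # avoid divide-by-zero on a flat block
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize(q, scale, lo):
    """Reconstruct approximate floats from the quantized ints."""
    return [v * scale + lo for v in q]

weights = [0.12, -0.5, 0.33, 0.0, 0.91, -0.27]
q, scale, lo = quantize(weights)
restored = dequantize(q, scale, lo)
print(q)        # six small integers, 4 bits each instead of 32
print(max(abs(a - b) for a, b in zip(weights, restored)))  # bounded by scale/2
```

The reconstruction error is at most half the scale step, which is why a well-chosen 4-bit quantization only costs the model a little quality while cutting memory roughly 8x.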
passepartout@feddit.de · 10 months ago
If you're ready to tinker a bit, I can recommend Ollama for the backend and Open WebUI for the frontend; they can also both run on the same machine. The advantage is that you can use your GPU for the computation, which is a lot faster.
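A minimal setup sketch for that combination using Docker, assuming the official `ollama/ollama` and `ghcr.io/open-webui/open-webui` images and their default ports; check each project's docs for current instructions:

```shell
# Backend: Ollama serves its API on port 11434 by default.
# For NVIDIA GPU acceleration, add --gpus=all (requires the
# NVIDIA Container Toolkit).
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull and chat with a Llama 3 8B model inside the container.
docker exec -it ollama ollama run llama3:8b

# Frontend: Open WebUI on http://localhost:3000, pointed at the
# host's Ollama API.
docker run -d --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/open-webui/open-webui:main
```

Both containers can live on the same box, as noted above; Ollama handles the GPU offload while Open WebUI just talks to its HTTP API.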
m-p{3}@lemmy.ca · 10 months ago (replying to the point below that LM Studio is not open source)
You're right, I thought they were, but I checked their GitHub and LM Studio itself isn't.
Pretty sure LM Studio is not open source.