theotherbelow@lemmynsfw.com to Selfhosted@lemmy.world • Can local LLMs be as useful and insightful as those widely available? • 17 hours ago
100%, you don’t have to train a thing: Ollama uses openly available models. Many of them are decent, but the best ones need a lot of RAM/VRAM.
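For anyone curious what that looks like in practice, here's a minimal sketch that queries a locally running Ollama server over its default HTTP API (port 11434). It assumes you've already pulled a model; `llama3` here is just an example name, swap in whatever you've pulled:

```python
# Minimal sketch: ask a local Ollama server a question (stdlib only).
# Assumes `ollama pull llama3` (or similar) has already been run.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",        # example model name, use any pulled model
        "prompt": "Why is the sky blue?",
        "stream": False,          # one JSON response instead of a stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

No training step anywhere, you just download weights someone else published and run them.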