

14 days ago
The main limitation is the VRAM, but I doubt any model is going to be particularly fast. I think phi3:mini on ollama might be an okay fit for Python, since it's a small model but was trained on Python codebases.
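If you want to try it, a minimal sketch of pulling and prompting phi3:mini with the ollama CLI (assumes ollama is installed and the local server is running; the prompt text is just an illustration):

```shell
# Download the phi3:mini model weights locally
ollama pull phi3:mini

# One-shot prompt from the command line (illustrative prompt, not from the thread)
ollama run phi3:mini "Write a Python function that reverses a string."
```

With limited VRAM you can also check `ollama ps` after a run to see how much memory the loaded model is actually using.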
Next time just shoot a movie industry exec, the sentence will be the same. /s (mandatory: don't actually shoot anyone please)