• pezhore@infosec.pub
    22 days ago

    That’s why I’ve stopped using non-local LLMs. Ollama works just fine on my outdated RTX 2060.