• brandon@lemmy.zip · 10 months ago

    Does anyone else feel weird about the vast amounts of computing resources seemingly being wasted on generative AI services like Copilot? It might be different if this were something that could run locally without requiring an internet connection, but clearly we aren’t there yet. Copilot specifically reminds me too much of Siri from iOS 5, and it’s often really slow compared to competing services.

    Sucks that this will inevitably become more common, but hopefully it’ll be easy enough to remap to something useful.

    • lseif@sopuli.xyz · 10 months ago

      you can run pre-trained llms locally with something like ollama, although in my experience the response quality isn't great
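
      For anyone curious, a minimal sketch of what that looks like with ollama (the model name `llama3.2` is just an example; substitute any model from the ollama library):

      ```shell
      # Download a model once; requires internet for this step only
      ollama pull llama3.2

      # Chat with the model interactively, entirely on local hardware
      ollama run llama3.2

      # Or hit the local HTTP API ollama serves (default port 11434)
      curl http://localhost:11434/api/generate -d '{
        "model": "llama3.2",
        "prompt": "Why is the sky blue?",
        "stream": false
      }'
      ```

      Response quality and speed depend heavily on the model size and your hardware, which is likely what drives the mixed results mentioned above.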