Does anyone else feel weird about the vast amounts of computing resources seemingly being wasted on generative AI services like Copilot? It might be different if this was something that could run locally without requiring an internet connection, but clearly we aren’t there yet. Copilot specifically reminds me too much of Siri from iOS 5, and it’s often really slow compared to competing services.
Sucks that this will inevitably become more common, but hopefully it’ll be easy enough to remap to something useful.
You can run pre-trained LLMs locally with something like ollama, although in my experience the response quality isn't great.