In my case, 95 packages depend on zlib, so removing it is absolutely the last thing you want to do. Fortunately, GPT also suggested refreshing the GPG keys, which did solve the update problem I was having.
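For the curious, the two checks above can be sketched in shell. This assumes an Arch-style system with pacman (the thread doesn't name the distro, so that's a guess on my part); `pactree` comes from the pacman-contrib package:

```shell
#!/bin/sh
# Hedged sketch, assuming an Arch-based system. Commands are guarded so
# this is a harmless no-op on systems without pacman tooling.

# List the packages that depend on zlib before even thinking of removing it:
if command -v pactree >/dev/null 2>&1; then
    pactree --reverse zlib
fi

# Refresh the pacman keyring when signature errors block an update:
if command -v pacman-key >/dev/null 2>&1; then
    sudo pacman-key --refresh-keys
fi
```

On a non-Arch box both guards simply skip, which is the point: check reverse dependencies first, then try the keyring refresh before anything destructive.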

You gotta be careful with that psycho!

  • infeeeee@lemm.ee · 11 months ago

    What will you call “AI in the traditional sense” when we finally get there? You can’t call it AI, because AI now means LLM.

    I’m not against the natural development of languages; I simply don’t like mislabeling things.

    • hitmyspot@aussie.zone · 11 months ago

      AI is already being categorized. It will likely be called “true AI” or “complex AI” or something similar. As with any technology, the newer version will eventually replace the old, and it will just be called AI again.

    • Hamartiogonic@sopuli.xyzOP · 11 months ago

      What people mean by AI has changed. When Optical Character Recognition was new, that was considered AI. Nowadays it’s so common that it’s nothing special any more. When people talk about AI in the 2020s, they usually exclude OCR from the definition.

    • averyminya@beehaw.org · 11 months ago

      It will likely either get some different term like NPA (Neural Processing Agent, which I just made up), or it will reclaim the term AI and the colloquial use of “AI” for LLMs will fade. That happens with all sorts of terms, and context matters a lot. “CPU” used to be a colloquialism for the whole computer; now that CPUs themselves come up more often, that usage has faded, outside of random SEO sales sites.

      It’s not really mislabeling; it’s colloquial language. Society ebbs and flows, and the only thing ever wrong with this, IMO, is a failure of communication due to poor contextual understanding (and delivery). If something is explained in a proper sentence, it usually doesn’t matter much whether you use academic language or colloquial terms. That is to say, you’re absolutely right: the distinction is important, but we can code-switch between everyday conversations that still get the point across and ones that use the exact specific terms and nitpick every wrong word.

      Also, I believe it’s currently called AI specifically because it’s being presented as an artificial intelligence (not that that makes it what the term was first coined to mean). Bard, a branded system built on LLMs, tries to speak in natural language while presenting information: it’s artificial, it’s “intelligence,” it presents itself as such! /s-ish (that’s not what I believe; it’s how I believe it’s being presented). Bard is more personified than ChatGPT or Bing Chat, but it’s not the first time chatbots have been called AI.

      Moreover, I think the term “artificial intelligence” is fairly broad linguistically and extremely narrow from its creators’ point of view (which was remarkably similar to how current “AI” functions today: image recognition in, responses out, except that machine happened to be analog and ours are digital). So generative AI, especially LLMs, is arguably closer to the original sense of AI than anything else. But that framing is limiting too, considering the vast amount of science fiction that has explored AI as explicitly more than the ability to recognize whether a presented image is a circle or a square.

      I’m with you, though: we should call LLMs what they are, and generative imaging is just that. But if they’re put together and I can have what feels like a conversation, and it can show me pictures of what it’s referencing, I’m not going to nitpick the nuances of what the AI is comprised of; I’m just going to call it an AI. The internal functions can change; it’s still AI. Just as, if I were in front of security cameras talking with a coworker about a technology that can track moving objects and label various things, I probably wouldn’t use the specific terms for the algorithms and image models involved: it’s an AI that can identify things in live video. (mythicalAI has more on that.)

      So, all in all, I think it really comes down to contextual presentation, and to the fact that artificial intelligence is, by nature, a set of constructions. It seems to me there inherently cannot be a single “AI,” because we’ve shown there are many ways to reach artificial intelligence. And what AI “really is” changes depending on who wrote about it or who built it.