• T156@lemmy.world · 7 months ago

    It’s arguably worse, since it seems to be more pervasive than crypto and NFTs were at their peak.

    Crypto never really hit the mainstream, and even NFTs were still fringe, whereas AI and AI accelerators are packed into basically every new phone and (Intel) processor.

    • thehatfox@lemmy.world · 7 months ago

      Regulatory hurdles kept crypto out of most mainstream products. There are no such barriers for AI, and any that are put up may come too late.

      There are also more possible mainstream use cases for AI - if the technology works as promised. That’s the biggest hurdle for AI currently, and some products like the Humane Pin are already tripping over it.

    • Grimy@lemmy.world · 7 months ago

      There are way more use cases for the average person than there were with crypto, so that’s only natural. There’s also a trust issue with crypto that doesn’t exist with AI, as well as the risk of losing your money when things go wrong.

      That being said, I don’t approve of this, nor of randomly adding it to products where it clearly has little use. If people want generative software, they can just choose to install it.

      • A Phlaming Phoenix@lemm.ee · 7 months ago

        There’s a trust issue here as well, since AI only works if you train it, and we are training it with our activity, which is reported to private companies who can do whatever they please with it. I don’t trust anything Microsoft does.

        • Grimy@lemmy.world · 7 months ago

          I meant more a trust issue in the sense that it’s hard for people to feel safe putting their money into crypto. A lot of the coins are scammy and even some of the exchanges don’t really look legit.

          In terms of privacy and collecting data, which is what I feel like you are referring to, the general population sadly just doesn’t give a shit. Most really don’t care about what’s being done with their data.

    • Sconrad122@lemmy.world · 7 months ago

      Why call out Intel? Pretty sure AMD and Nvidia are both putting dedicated AI hardware in all of their new and upcoming product lines. From what I understand, they are even generally doing it better than Intel. Hell, Qualcomm is advertising the AI performance of its new chips, and so is Apple. I don’t think there is anyone in the chip world that isn’t hopping on the AI train.

      • T156@lemmy.world · 7 months ago

        Because I was only aware of Intel (and Apple) doing it on computers, whereas most major flagship mobile devices have those accelerators now.

        GPUs were excluded, since they’re not as universal as processors are. A dedicated video card is still by and large considered an enthusiast part.

        • Sconrad122@lemmy.world · 7 months ago

          Fair enough. Was just asking because the choice of company surprised me. AMD is putting "AI Engines" in their new CPUs (a separate silicon design from their GPUs), and while Nvidia largely only sells GPUs that are less universal, they’ve had dedicated AI hardware (tensor cores) in their offerings for the past three generations. If anything, Intel is barely keeping up with its competition in this area. (For the record, I see vanishingly little value in the focus on AI as a consumer, so this isn’t really a ding on Intel in my books, more so an observation from a market forces perspective.)

        • Sconrad122@lemmy.world · 7 months ago

          You’re not wrong that GPU and AI silicon design are tightly coupled, but my point was that both of the GPU manufacturers are dedicating hardware to AI/ML in their consumer products. Nvidia has tensor cores in its GPUs that it justifies to consumers with DLSS and RT, but they were clearly designed for AI/ML use cases when Nvidia presented them with Turing. AMD has the XDNA AI Engine that it is putting in its APUs, separate from its RDNA GPUs.