• finley@lemm.ee

    Fucking do it. Anything that takes down Nvidia’s CUDA Monopoly has my full support.

    • jabjoe@feddit.uk

      As a general rule, don’t use a corporation’s language. Languages, and their reference implementation, should be truly independent.

      Edit: To be clear, programming language.

        • jabjoe@feddit.uk

          Corporation? I’m not anti business, far from it. But I have an interest in economics as well as technology. We need effective markets. CUDA is an example of a market problem caused by a corporation’s own language. It has screwed up competition.

          • JustARaccoon@lemmy.world

            I mean, if anything, look at Velcro and how genericising a term makes it untrademarkable. Overusing words can and will screw companies.

            • kureta@lemmy.ml

              He means programming language. Don’t use programming languages that are controlled by a single company. Not “don’t say CUDA when you mean any general purpose GPU programming language”.

            • Jakeroxs@sh.itjust.works

              The funny thing with that: I haven’t seen a term genericised like that from a tech company, though.

              Xerox is the only one I can think of that came close, Googling at this point…

        • Telodzrum@lemmy.world

          Down 6% YTD and a market cap of 1/12 of their only competitor isn’t “doing great.”

          • TheGrandNagus@lemmy.world
            • AMD’s Data Center revenue surged 115% YoY in Q2 2024, driven by Instinct MI300 AI accelerators and EPYC processors, leading to raised revenue guidance and a BUY rating.
            • Client segment revenue grew 49% YoY, fueled by strong Ryzen processor demand and Zen 5 sales, prompting an increased revenue forecast for 2024 and 2025.
            • Gaming segment revenue fell 59% YoY due to weak console demand, resulting in a lowered revenue forecast for this segment through 2025.
            • AMD’s acquisitions of Silo AI and ZT Systems aim to strengthen its AI and data center market position, enhancing long-term growth prospects.

            Did you even read it?

            Down 6% YTD and a market cap of 1/12 of their only competitor isn’t “doing great.”

            Firstly, Nvidia isn’t AMD’s only competitor. There’s Intel, Qualcomm, and a litany of other companies that compete with Xilinx, who AMD purchased.

            Secondly, only looking at the stock price and treating that as gospel shouldn’t be how you’re assessing whether a company is doing well or not.

            GameStop stock exploded, and it wasn’t because the company was doing well. Nvidia became the most valuable company on earth by market cap: do you really think they’re worth more than Microsoft or Apple, or is it a bubble? Tesla stock went crazy because Elon kept talking nonsense about how all their cars would be driverless by 2018. And so on. Shit, go back far enough and tulip bulbs were worth more than the entire textiles industry, if you only care about market cap.

            The stock market is not rational. Don’t judge the health of a company purely based on their stock price.

            • Telodzrum@lemmy.world

              Did you even read it?

              Well, it’s accountwalled so no.

              Additionally, holy shit my dude. Nice work being Exhibit 1A as to why retail investors and common folk shouldn’t be allowed to speak about the stock market, corporate valuation, or business finances generally.

              • andyburke@fedia.io

                Can’t take an L and reevaluate or something? Why this response to a pretty reasonable rebuttal of your surface take?

              • TheGrandNagus@lemmy.world

                The irony of saying that when you think the be-all and end-all of how good a business is doing is looking at its stock price this year.

                If only it were that simple.

          • iopq@lemmy.world

            My man, by the time I read this comment it was down 1% YTD.

            Shows how much you know, talking about returns over a few months.

            It’s up 361% over last 5 years, which is beating $QQQ (tech stocks in general) which is up 138% and $INTC which is down 63%

            $NVDA is up 2295% over that time frame and it’s a ridiculous comparison, almost no company grew that fast

            $AAPL must be shit since it only grew 309% over 5 years, right?

  • TheFeatureCreature@lemmy.world

    Anything to help them take on Nvidia and stay competitive is a good move. However, I wish they would also announce a recommitment to driver and software stability. I had to move to Nvidia for my workstation rig after having constant stability issues with numerous AMD cards across multiple builds. I can handle a few rough edges or performance that isn’t top-of-the-line, but I can’t put up with constant crashes and driver timeout errors. It’s annoying in games and devastating when I’m working.

    I wish their GPU line received even a portion of the polish and care that their CPU line did.

    • TheGrandNagus@lemmy.world

      Damn, I’ve had the exact opposite experience. I had to move away from a 1080 Ti that I was having constant instability with, even after I went back to the retailer and got a new card.

      Unfortunately at the time, AMD didn’t have anything performance competitive. But it was worth the downgrade for the better drivers.

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com

        I had to move away from a 1080 Ti that I was having constant instability with, even after I went back to the retailer and got a new card.

        Was it the card or was it something else? Any chance you have a 13th or 14th gen Intel CPU?

    • Refurbished Refurbisher@lemmy.sdf.org

      As a Linux user, I had to trade in my Nvidia laptop for one with an AMD GPU due to how unstable the Nvidia drivers were and how many problems they were giving me. With the AMD laptop, I have had zero issues.

      • RussA

        I did the same move for similar reasons! Although I still keep Windows around on another SSD - and even the Windows Nvidia drivers were being funky for me.

        Nvidia shares a lot of logic between their Windows and Linux drivers as far as I’m aware, so I suppose it makes sense.

    • schizo@forum.uncomfortable.business

      The annoying part is their drivers are stable…sometimes.

      It’s an endless game of seeing if any specific version is broken in a way that annoys you, and rolling back if you find an issue.

      Not exactly a premium experience.

      • ayaya@lemdro.id

        Even on Linux where their drivers are supposed to be better, my 7900XTX has been crashing randomly for at least a month and it was only fixed in the latest 6.10.9 kernel release yesterday.

        • schizo@forum.uncomfortable.business

          Yeah, I’ve heard the ‘AMD drivers are better!’ thing for Linux and have always been confused, since I’ve had no driver issues with Nvidia cards on either Linux or Windows.

          AMD stuff, on the other hand, has been a nonstop mess, except for my ROG Ally, which for some reason is fine?

          In short: computers suck and are unpredictable, or something.

      • TheFeatureCreature@lemmy.world

        Yeah, this too. My dad’s last GPU was AMD and he had to flip-flop between versions to fix crashes. I wasn’t as lucky, as no driver version was able to calm the crashing.

  • jaxiiruff@lemmy.zip

    This checks out after what they recently did to the ZLUDA project. As an owner of an AMD GPU, I agree that ROCm support is really bad. It works half of the time, and fairly poorly even then.

    • DarkThoughts@fedia.io

      Even worse on Linux, and worse still on more exotic distros like Bazzite, where I still can’t get koboldcpp to run; it was already kind of a hassle on my previous distro.

      • jaxiiruff@lemmy.zip

        Yeah, I hear ya. How do you happen to be running it? I use NixOS and it’s a challenge there for me, but I found at least some success using Docker, since the dependencies are so out of control for AI at the moment.

        Also give ollama (ROCm build) + Open WebUI a shot as an alternative to koboldcpp if you can’t get at least some text generation to work, because that is the only thing I have got working with ROCm.
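
        For reference, the ROCm build of ollama is usually easiest via its container image; a hedged sketch (image tag and device flags per Ollama’s Docker docs, adjust as needed):

        ```shell
        # Sketch: run the ROCm build of ollama in Docker, passing the GPU device
        # nodes through (/dev/kfd is the ROCm compute interface, /dev/dri the GPU).
        docker run -d \
          --device /dev/kfd --device /dev/dri \
          -v ollama:/root/.ollama \
          -p 11434:11434 \
          --name ollama ollama/ollama:rocm
        ```

        Open WebUI can then be pointed at http://localhost:11434 as its ollama backend.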

        • barsoap@lemm.ee

          The trick to NixOS, in this instance, is to use a Python venv. Python dependencies are fickle and nasty in the first place, triply so when talking about fast-churning AI code. I tried specifying everything with Nix, and I succeeded, but then you have random ComfyUI plugins assuming they can get a writeable location by constructing a path from ComfyUI’s main.py. It’s not worth it: let Python be the only dependency you feed in, and let pip and general Python jank do the rest.
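
          Concretely, the venv approach above might look like this (the ComfyUI checkout and requirements file are illustrative):

          ```shell
          # Sketch: let the system (or Nix) provide python3 only; everything else
          # lives inside a throwaway venv that pip manages.
          python3 -m venv .venv
          . .venv/bin/activate
          python -m pip --version                     # pip now resolves inside .venv
          # python -m pip install -r ComfyUI/requirements.txt   # once ComfyUI is cloned
          ```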

        • DarkThoughts@fedia.io

          At the moment I just don’t. I got koboldcpp to run through distrobox / BoxBuddy, but I can’t get it to compile with ROCm, so I can only use CPU generation, which is abysmally slow. Might go back to NovelAI when they release their new model if I can’t find a solution.

          • RussA

            What card do you use? I have a 6700 XT, and getting anything with ROCm running requires that I pass the HSA_OVERRIDE_GFX_VERSION=10.3.0 environment variable to the related process, otherwise it just refuses to run properly. I wonder if it might be something similar for you too?
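
            For anyone else hitting this, the override is just an environment variable on the launching shell; a minimal sketch (the koboldcpp invocation is illustrative):

            ```shell
            # Sketch: tell ROCm to treat the GPU as gfx1030 (officially supported
            # RDNA2) instead of the 6700 XT's actual gfx1031 ISA.
            export HSA_OVERRIDE_GFX_VERSION=10.3.0
            echo "override set to $HSA_OVERRIDE_GFX_VERSION"
            # python koboldcpp.py ...   # launch the ROCm-backed process from this shell
            ```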

            • barsoap@lemm.ee

              5500 here. I can’t use any recent ROCm version, because the GFX override I use is for a card that apparently has a couple more instructions, and the newer kernels instantly crash with an illegal-operation exception.

              I found a build someone made, buried in a Docker image, and it does indeed work for the 5500 without the override, but it uses all-generic code for the kernels and is about 4x slower than the ancient version.

              What’s ultimately the worst thing about this isn’t that AMD isn’t supporting all cards for ROCm; it’s that the support is all or nothing. There’s no “we won’t be spending time on this, but it passes automated tests, so ship it”. It’s “oh, the new kernels broke that old card, tough luck, you don’t get new kernels”.

              So in the meantime I’m living with the occasional (every couple of days?) freeze when using ROCm, because I can’t reasonably upgrade. And it’s not just that the driver crashes and the kernel tries to restart it: the whole card needs a reset before it will do anything but display a VGA console.

              • RussA

                Yeah, I definitely am not a fan of how AMD handles ROCm; there are so many weird cases of “well, this card should work with ROCm, but… [insert some weird quirk you have to do, like the one I mentioned, or what you’ve run into]”.

                Userspace/consumer side, I enjoy AMD, but I fully understand why a lot of devs don’t make use of ROCm and why Nvidia has such a tight hold on the GPU-compute world with CUDA.

            • DarkThoughts@fedia.io

              6650 XT. Honestly, no idea. When I run make LLAMA_HIPBLAS=1 GPU_TARGETS=gfx1032 -j$(nproc) on koboldcpp in the Fedora distrobox, it throws a bunch of

              fatal error: 'hip/hip_fp16.h' file not found
                 36 | #include <hip/hip_fp16.h>

              errors, and koboldcpp does not give an option to use Vulkan.
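
              That fatal error usually means the HIP development headers aren’t present inside the container, rather than anything koboldcpp-specific; a quick check, assuming a standard /opt/rocm install prefix:

              ```shell
              # Sketch: verify the HIP header the build is looking for actually exists.
              if [ -e /opt/rocm/include/hip/hip_fp16.h ]; then
                  echo "HIP headers present"
              else
                  echo "HIP headers missing: install the HIP devel package for your distro"
              fi
              ```

              On Fedora the package is likely something along the lines of rocm-hip-devel, but that name is an assumption; check your distro’s ROCm packaging.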

              • RussA

                Ah, strange. I don’t suppose you specifically need a Fedora container? If not, I’ve been using this Ubuntu-based distrobox container recipe for anything that requires ROCm, and it has worked flawlessly for me.

                If that still doesn’t work (I haven’t actually tried out koboldcpp yet), and you’re willing to try something other than koboldcpp, then I’d recommend the text-generation-webui project, which supports a wide array of model types, including the GGUF type that koboldcpp utilizes. Then, if you really want to get deep into it, you can even pair it with SillyTavern (it is purely a frontend for a bunch of different LLM backends; text-generation-webui is one of the supported ones)!

                • DarkThoughts@fedia.io

                  I don’t think so, it’s just what I’m more familiar with, and I usually try to avoid apt’s PPA hell as much as I can. But maybe I should try some others, as I couldn’t get Mullvad to run yet either. :/

                  I tried text-generation-webui for quite a while before I went to koboldcpp on my previous distro, and I could not get it to run without crashing whenever I tried to generate anything. SillyTavern is my standard frontend, so any text-gen software should inherently be compatible with that anyway.

  • AmidFuror@fedia.io

    CDNA is actually DNA made using RNA as a template. Very important for the viral ecosystem.

  • lustyargonian@lemm.ee

    When AMD moved on from its GCN microarchitecture back in 2019, the company decided to split its new graphics microarchitecture into two different designs, with RDNA designed to power gaming graphics products for the consumer market while the CDNA architecture was designed specifically to cater to compute-centric AI and HPC workloads in the data center.

    I wonder if CDNA will be more akin to Tensor Cores on RTX GPUs, leading to better ray-tracing performance in gaming.

    • barsoap@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      3 months ago

      Tensor cores have nothing to do with raytracing. They’re cut-down GPU cores specialising in tensor operations (hence the name) and nothing else. Raytracing is accelerated by RT cores, which do BVH traversal and ray-intersection operations; the tensor cores are in there to run a denoiser that turns the noisy mess real-time RT produces into something that’s, well, not messy. Upscaling, essentially: the only difference between denoising and upscaling is that in upscaling the noise is all square.

      And judging by how AMD has done this stuff before: nope, they won’t do separate cores; they’ll make sure the ordinary cores can do all that stuff well.

      • lustyargonian@lemm.ee

        Oh, I see. So DLSS, and especially ray reconstruction, use tensor cores; would that be right?

        I guess then it may be better to keep expectations low.

        • barsoap@lemm.ee

          Yep, that’s what Nvidia marketing seems to be calling their denoiser nowadays. Gods spare us marketing departments.

  • fuckwit_mcbumcrumble@lemmy.dbzer0.com

    Fermi 2.0?

    Or is this going to be more like Kepler, where the consumer-grade stuff doesn’t have all of the power-hungry features and the datacenter stuff gets them?