• Alto@kbin.social · 11 months ago

    Why is anyone believing what the Intel CEO says about his own products when there’s not a shred of evidence to back it up? “Man who stands to make large amounts of money by lying lies” is not exactly an uncommon occurrence.

    • Centillionaire@kbin.social · 11 months ago

      One reason to believe him is that they aren’t allowed to mislead investors with false claims. It would be a big deal if the product severely fell short of what he’s saying now.

      • assassinatedbyCIA@lemmy.world · 11 months ago

        Good thing no CEO ever does that. *stares intently at Elon Musk* Even better that they’re always punished for it. *eyes start turning red and stinging as the intensity of the staring increases*

      • Alto@kbin.social · 11 months ago

        Except they’re not promising anything other than “it’s ahead.” What does that mean? A smaller node? Peak performance? Performance per watt? “It’s ahead” doesn’t mean anything of substance and gives him all the legal leeway in the world.

        • olosta@lemmy.world · 11 months ago

          He’s not even promising that; he’s saying he “thinks” he is ahead. And he says nothing about yields. Having the greatest node is useless if you can’t deliver volume.

    • Joker@discuss.tchncs.de · 11 months ago

      Gelsinger is alright. Definitely not a clueless PHB. He’s an engineer who designed Intel’s processors back in their heyday. He spent more than a decade rising through the ranks on the tech side before he got into management. This guy was a force in the microprocessor industry. They brought him back to right the ship and get back to an engineering culture after the PHBs let the tech languish.

      I don’t recall him really saying anything positive about the tech since he came back. I mean, they’re still selling what they have and making the most of it, because that’s what they have to do. But it’s not like he’s been hyping anything. If he’s saying now that they have a good chip, then I’m inclined to believe him.

      • Alto@kbin.social · 11 months ago

        The guy could be the king of England and I wouldn’t care. I don’t care what he has and hasn’t hyped in the past. Believing a single word that comes out of a CEO’s mouth about a product without any proof is just foolish.

  • just_another_person@lemmy.world · 11 months ago

    I wouldn’t be mad about an Intel comeback. It fosters competition and puts the US back on the map for fab production. What I’m suspicious of is how this same team has managed to do this after flubbing and falling SO FAR behind in the past 5 years. It’s going to take two generations for them to catch up to AMD on TDP, and the fact that they couldn’t even release a half-decent graphics platform with ARC really makes me skeptical of these claims. Their products are terrible, they’re getting slammed on all fronts by every other chip manufacturer, and they haven’t landed a solid win in any major datacenter deal since 2020. We’ll see what happens, but until things start rolling out of these processes…eh.

    • Tetsuo@jlai.lu · 11 months ago

      I think ARC is a good thing. It’s pretty bad performance-wise, but it’s a new manufacturer in this field. It means more competition for AMD and Nvidia, which is always good for the consumer.

      I suppose it is very hard to enter that market at this point, and they still did it.

    • Joker@discuss.tchncs.de · 11 months ago

      I’m pumped about a comeback; it’s what I’ve been hoping for since Gelsinger came back. I’ve always been more Team Red, but a strong Intel is good for consumers, the industry, and the USA. Last time they were down for any extended period, they rolled out the Core 2 Duo and had some really great stuff for a number of years after. I want to be blown away by a new chip like we were back then.

    • ColeSloth@discuss.tchncs.de · 11 months ago

      I agree. Intel is floundering on almost every front right now. It half serves them right for price-gouging processors for over a decade, but both AMD and Nvidia need some competition.

  • Buffalox@lemmy.world · 11 months ago

    Intel has always been able to come back when the competition surpassed them.
    But honestly, this time I was very skeptical they would make it. They spent more than five years fumbling the ball in several ways: after Itanium, they failed on their production process, and they failed on core design against AMD, resulting in the first period where Intel was financially pressed and actually ran deficits.
    They’d also failed on GPUs and their ray-tracing design, which was supposed to compete on AI too. None of that worked at all against far better competing products. And when their products began to fall behind AMD on servers, it seemed like the ship had sailed.
    But it seems Intel is clawing their way back again, as they’ve managed so many times before.
    And I’ve never cheered it on as much as I do now. TSMC was on its way to monopolizing high-end chip manufacturing, and in the long run that is very unhealthy for everybody involved.

    • the_q@lemmy.world · 11 months ago

      They haven’t made a comeback yet. This article is just PR to make sure investors don’t leave just yet.

      • Buffalox@lemmy.world · 11 months ago

        Yes, I suppose you are right; this is a year into the future, and we’ve seen promises before where it didn’t go quite as planned.
        Still, it looks like they are at least catching up.

    • Vik@lemmy.world · 11 months ago

      A new start in dGPUs is no easy task, but I honestly thought Arc’s relative real-time ray-tracing (RTRT) and compute performance were quite good?

      My main complaint would be their Linux support situation for Arc. I’m hoping it will improve over time.

      • Buffalox@lemmy.world · 11 months ago

        I don’t count Arc as decidedly a failure (yet); allegedly the cards are pretty good with the new drivers. I was referring to the earlier attempts at cards that could supposedly run pure ray tracing and were meant to compete with Nvidia in datacenters. But to be fair, that fumbling was some years ago; I was talking more five to ten years back, when Intel seemed to fail at just about everything.

        But Itanium, their new 64-bit server CPU, is way longer ago, and so is the GPU (Knights Ferry, I think): a complete failure, with twice the energy consumption and half the performance of Nvidia’s parts. Only later did their production process begin to fail too, Intel Core 2 was soon beaten by Ryzen on all parameters, and of course Optane failed as well.

        • hips_and_nips@lemmy.world · 11 months ago

          Itanium

          Now there’s a name I haven’t heard in a long time.

          And the Xeon Phi (Knights Ferry/Landing) was in the GPU space, but only for GPGPU. The idea was that the Xeon Phi, with x86-compatible cores, could run software originally targeted at a standard x86 CPU with less modification. It had something like 68-70 x86-64 cores.

          I had a couple of them when I was taking parallel programming back in the day. Nifty little devices, but largely outshined by distributed multiprocessing for x86-64 and paled in comparison to the power of CUDA. That might be my own bias talking though.
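          To illustrate the “less modification” pitch: the loop below is ordinary portable C, and the names and numbers in it are just an illustrative sketch, not tied to any specific Phi toolchain.

          ```c
          #include <stdio.h>

          /* Hypothetical sketch of the Phi's selling point: this is plain x86
             code. To spread the loop over the Phi's many cores you would add
             an OpenMP pragma (e.g. "#pragma omp parallel for reduction(+:sum)")
             and recompile -- no CUDA-style kernel rewrite required. */
          static double sum_to(long n) {
              double sum = 0.0;
              for (long i = 0; i < n; i++)
                  sum += (double)i;  /* same source runs on a normal CPU */
              return sum;
          }

          int main(void) {
              printf("%.0f\n", sum_to(1000000)); /* prints 499999500000 */
              return 0;
          }
          ```

          Contrast that with CUDA, where the same loop would have to be restructured into a kernel plus explicit host/device memory transfers.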

          • Buffalox@lemmy.world · 11 months ago

            paled in comparison to the power of CUDA. That might be my own bias talking though.

            I don’t have personal experience, but AFAIK that’s what everybody says. They were marketed as compute units, but their compute performance was very poor compared to the competition’s.

      • Vik@lemmy.world · 11 months ago

        In terms of technology and product offerings (perf/watt, compute density, TCO) relative to AMD, Intel have absolutely fallen behind.

        That said, this story has taken time to show up in server market share, and Intel are still the major player.

      • ColeSloth@discuss.tchncs.de · 11 months ago

        More like quickly losing ground. AMD has been taking server share from Intel for the past eight years or so. Intel still has more out there, but AMD has been gaining share every year.

        In 2016, AMD’s server share was around 1%. Now it’s at 17% and climbing every year.

  • Nighed@sffa.community · 11 months ago

    Their problem has always been with yield, not the node size, right? They could make the smaller nodes, just not cost-effectively?

    • just_another_person@lemmy.world · 11 months ago

      They’ve had great success with different generations on both fronts. I can’t seem to find the articles, but their biggest win on yield came out of their fab in Colorado over a decade ago, which they essentially sold for scrap after it became irrelevant.

  • BudgieMania@kbin.social · 11 months ago

    Gelsinger was clearly waiting until I sold my shares for a measly profit before revealing his memes were not memes -_-

    But seriously now, this is really good news. We need as many options for stupidly advanced nodes as we can get; we got a taste of what happens when there’s only one great choice on the block during COVID, and I don’t wanna see that ever again.

    • ColeSloth@discuss.tchncs.de · 11 months ago

      Sure, but with Intel’s constant stream of failures over the past eight years or so, I’m not believing any “great things are on the horizon” talk coming from Intel. They’re failing in the home market, the video card market, and the server market right now. They need something.