Edit: Fuck some of these comments. Y’all sick in the head. I’m out.

  • Trainguyrom@reddthat.com · 11 days ago

    …So whoever they hire as CEO now is probably there to just distribute golden parachutes and eat the company as it dies :/

    This is assuming there isn’t some gold already in the pipeline. The timeline for a new CPU design is about 8 years from first drawings to actual silicon hitting media test benches, meaning whatever was started in 2019 and 2020 could be absolutely killer and just cooking to perfection in the R&D oven… assuming R&D was kept sufficiently funded and the engineering talent was retained to see such a project through.

    • brucethemoose@lemmy.world · 11 days ago (edited)

      Their CPUs are mostly fine now. The “small” cores are very competitive in servers because they’re so small for their performance, and TBH that desktop drama was marketing clocking the CPUs way too high to squeeze out 4% more benchmark performance.

      It’s… everything else that’s the problem, as work is increasingly shifting away from CPUs. They are totally screwed if they cut funding for Arc, in particular, or if they don’t secure any real fab customers.

      • Trainguyrom@reddthat.com · 11 days ago

        Oh yeah, real talk: they’ve got some killer datacenter chips, be that networking, CPU, or GPU. They’re continuing to work on bleeding-edge technologies for hyperscalers, and they’ve got no shortage of insane potential. But when they release two generations of desktop processors with hardware bugs, it really puts a heck of a stain on such a stellar portfolio and makes it a lot easier for enterprises to look at AMD for their datacenter and client processors (especially when AMD is absolutely killing it like they have been in both segments!)

        • brucethemoose@lemmy.world · 10 days ago (edited)

          Eh, Intel’s data center GPUs suck. Gaudi was okay but gained no critical mass and is being phased out; Xe-HPC is being phased out and was hardly used anywhere; even Falcon Shores keeps getting delayed and looks to be in trouble, going by statements about focusing on “consumer inference.” They seem dead in the water here, which is very worrying.

          The MI300X is actually good, but AMD totally blew it by ignoring a few glaring software issues and not seeding development with consumer GPUs, hence it isn’t gaining much traction. The MI300A (the big APU) basically isn’t available or cost-effective in any cloud instances.