So it’s GANAM now (GAFAM or GAMAM with Nvidia swapped in)

    • conciselyverbose@sh.itjust.works · 8 months ago

      LLMs are a bubble.

      But the uses of massively parallel math are still in their infancy: scientific compute, machine learning, all kinds of different simulations. Nvidia has been setting themselves up for all of it with CUDA for years. At least until we get better options for physically replicating neurons (primarily how interconnected they are in a brain), GPUs, and CUDA specifically, are how most AI is going to happen. And as the power increases, the ability to run increasingly complex physics simulations of increasingly complex phenomena is only going to get more relevant. Right now it’s stuff like protein folding, fluid dynamics, whatever. But there’s way more coming, and all of it is going to use GPUs.
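
      To make “massively parallel math” concrete, here’s a minimal sketch of the CUDA model: one tiny function (SAXPY, the textbook example) launched across a million threads, one per array element. The array size, block size, and constants here are arbitrary illustration, not any real workload.

      ```cuda
      // Minimal illustrative CUDA SAXPY: y = a*x + y, one thread per element.
      #include <cstdio>
      #include <cuda_runtime.h>

      __global__ void saxpy(int n, float a, const float *x, float *y) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
          if (i < n) y[i] = a * x[i] + y[i];              // each thread does one element
      }

      int main() {
          const int N = 1 << 20;  // ~1M elements, arbitrary size for the sketch
          float *x, *y;
          cudaMallocManaged(&x, N * sizeof(float));  // unified memory, visible to CPU and GPU
          cudaMallocManaged(&y, N * sizeof(float));
          for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

          // Launch enough 256-thread blocks to cover all N elements at once.
          saxpy<<<(N + 255) / 256, 256>>>(N, 3.0f, x, y);
          cudaDeviceSynchronize();  // wait for the GPU to finish

          printf("y[0] = %f\n", y[0]);  // expect 5.0 = 3*1 + 2
          cudaFree(x); cudaFree(y);
          return 0;
      }
      ```

      Every ML training kernel, fluid solver, or folding simulation is built from the same basic pattern: map one thread onto one small piece of data and launch the whole lot at once.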

  • WallEx@feddit.de · 8 months ago

    Can infinite growth be real? Ask a supposed market expert and a kid and you’ll get different answers, and only one of them is correct.
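
    To put a rough number on it (round illustrative figures, not from the thread): take a company worth about $2T growing a steady 25% a year, against a world GDP of roughly $105T. Compounding runs it past the entire world economy in under two decades:

    $$V(t) = V_0\,(1+g)^t \quad\Rightarrow\quad 2\,\mathrm{T\$} \times 1.25^{18} \approx 111\,\mathrm{T\$} > 105\,\mathrm{T\$}$$

    Any fixed positive growth rate hits the same wall eventually; the kid is right.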

    • Telodzrum@lemmy.world · 8 months ago

      Yeah, of course it is. It’s not a trend but an outlier; odd as it seems, that’s how these things work.