• @brucethemoose@lemmy.world · 5 points · edited · 6 days ago

    The last part is absolutely false. The Nvidia H100's TDP is around 700W, though ostensibly configurable. The B200 is 1000W, and the AMD MI300X is 750W.

    They also skimp on RAM on many SKUs, so you have to buy the higher-clocked ones.

    They run in insane power bands just to eke out a tiny bit more performance. If they ran at like a third of the power, I bet they would be at least twice as power efficient, since power draw scales superlinearly with voltage and clock speed.
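
    A rough back-of-envelope sketch of that scaling claim (a hypothetical illustration only: it assumes the usual dynamic-power model P ∝ f·V², with voltage dropping roughly in step with clock and performance tracking clock; the scale factors below are made up, not measured on any real GPU):

    ```python
    # Back-of-envelope dynamic power scaling: P ∝ f * V^2.
    # In the DVFS operating range voltage tends to track frequency, so power
    # falls off much faster than performance (perf ∝ f here, i.e. assuming a
    # compute-bound workload). All numbers are illustrative, not measured.

    def relative_power(freq_scale: float, volt_scale: float) -> float:
        """Power relative to stock, given frequency and voltage scale factors."""
        return freq_scale * volt_scale ** 2

    def perf_per_watt(freq_scale: float, volt_scale: float) -> float:
        """Performance per watt relative to stock (performance ∝ frequency)."""
        return freq_scale / relative_power(freq_scale, volt_scale)

    # Stock: 1.0x clock at 1.0x voltage -> 1.0x power, 1.0x perf/W.
    # Hypothetical downclock: 70% clock at 69% voltage.
    freq, volt = 0.70, 0.69
    print(f"power:  {relative_power(freq, volt):.2f}x")  # ~0.33x, about a third of stock power
    print(f"perf/W: {perf_per_watt(freq, volt):.2f}x")   # ~2.10x, roughly twice as efficient
    ```

    Real silicon obviously doesn't follow the cube law exactly, but the direction of the argument holds: trading the last bit of clock for a big drop in voltage buys a lot of efficiency.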

    But no, just pedal to the metal. Run the silicon as hard as it can, and screw power consumption.

    Other AI companies like Cerebras are much better, running at quite sane voltages. Ironically (or perhaps smartly), the Saudis invested in them.

    • @self (admin) · 24 points · 6 days ago

      > Other AI companies like Cerebras are much better, running at quite sane voltages. Ironically (or perhaps smartly), the Saudis invested in them.

      it’s real bizarre you edited this in after getting upvoted by a few people

      • David Gerard (OP, mod, admin) · 18 points · 5 days ago

        banned them for the subtle spam attempt

        • @self (admin) · 23 points · 5 days ago

          “it’s just lemmy, my reddit shit will work fine there” says meme stock marketer who’s never been subjected to any kind of scrutiny

        • @self (admin) · 14 points · 6 days ago

          do the results of your personal research frequently look like marketing horseshit?

          • @froztbyte · 8 points · edited · 6 days ago

            (notably posted at a 7min delta after the other comment with oh so specific details, and just entirely dismissing the man behind the curtain as to the plurality of compute involved)

    • @froztbyte · 7 points · 6 days ago

      wow, exemplary performance