• MNByChoice@midwest.social
    7 hours ago

    A lot of the need is due to the heat density of the GPUs used for GenAI. Could they build less densely? Yes, and they likely already do, but they need to go further. I have seen data centers with racks less than half populated (closer to one quarter, I think) because of energy-density issues.

    Could they use sea water? Sea water causes more corrosion. (I am uncertain if this data center is close to the ocean.)

      • MNByChoice@midwest.social
        5 hours ago

        “…that can also do GenAI work for a similar ‘hardware cost per output’?” No.

        FYI, the host servers for these cards often carry eight cards each. The total power draw is the host server’s CPU and RAM plus eight times 750 W (or whatever a single card draws). It scales up quickly.
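        The scaling above can be sketched with a quick back-of-the-envelope calculation. The host-side figure and the 4-servers-per-rack count below are illustrative assumptions, not vendor specs; the 750 W per GPU comes from the comment itself.

```python
# Back-of-the-envelope power estimate for an 8-GPU host server.
# gpu_watts (750 W) is from the comment above; host_watts and the
# servers-per-rack figure are illustrative assumptions.

def server_power_watts(gpu_count: int = 8,
                       gpu_watts: float = 750.0,
                       host_watts: float = 800.0) -> float:
    """Total draw: host (CPU, RAM, fans, etc.) plus all GPUs."""
    return host_watts + gpu_count * gpu_watts

# One 8-GPU server at the figures above:
print(server_power_watts())          # 6800.0 W per server

# A hypothetical rack of 4 such servers:
print(4 * server_power_watts())      # 27200.0 W, i.e. ~27 kW per rack
```

        At ~27 kW per rack against a facility budget sized for traditional servers, it is easy to see why operators leave racks partially populated.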

        • Harbinger01173430@lemmy.world
          5 hours ago

          Seems like an optimization issue.

          If they can’t train and run a big-ass AI model on iGPU power at very fast speeds, then they’re useless as developer companies.

          So much bloat.