• Lucy :3@feddit.org · 1 day ago

    Afaik for consumers only the 5090 has 32GB of VRAM. So you’re correct, it’s practically impossible to find. And even if you find one, it’s prone to spontaneous combustion.

    For servers, it currently tops out at 288GB with the AMD MI355X.

    • A7thStone@lemmy.world · 16 hours ago

      And they cost more than a high-end PC. I’m not spending $3k on a card that can go up in smoke. Not to mention, all of the honest reviewers I’ve seen say its performance improvements are all smoke and mirrors.

    • Anivia@feddit.org · 1 day ago

      Afaik for consumers only the 5090 has 32GB VRAM

      Only if you don’t count Apple Silicon with its shared RAM/VRAM. Ironically, a Mac Mini / Studio is currently the cheapest way to get a GPU with lots of VRAM for AI.
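
      As a minimal sketch of what that shared memory means in practice (assuming PyTorch, which nobody in the thread actually names): on Apple Silicon the GPU draws from the same pool as system RAM, so the MPS backend can allocate tensors limited mainly by total machine memory rather than a fixed VRAM cap.

      ```python
      # Hypothetical illustration, not from the thread: place a tensor in
      # Apple Silicon unified memory via PyTorch's MPS backend if available.
      import torch

      device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
      x = torch.randn(8192, 8192, device=device)  # ~256 MB, allocated from shared RAM/VRAM
      print(device, x.shape)
      ```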