Nvidia Preps A100 GPU with 80GB of HBM2E Memory
#1
Quote:
[Image: vkoYB5GWjcGw7ARyRKLqSb-1024-80.png.webp]

80 Giga...what?

Nvidia recently added a yet-unannounced version of its A100 compute GPU with 80GB of HBM2E memory to its lineup, this time in a standard full-length, full-height (FLFH) card form factor, meaning this beastly GPU drops into a PCIe slot just like a 'regular' GPU. Because Nvidia's compute GPUs like the A100 and V100 are aimed mainly at servers in cloud data centers, the company prioritizes the SXM versions (which mount directly on a motherboard) over regular PCIe versions. That doesn't mean it lacks leading-edge GPUs in a regular PCIe card form factor, though.

Nvidia's A100-PCIe accelerator, based on the GA100 GPU with 6912 CUDA cores and 80GB of HBM2E ECC memory (featuring 2TB/s of bandwidth), will offer the same capabilities as the company's A100-SXM4 accelerator with 80GB of memory, at least as far as compute capability (version 8.0) and virtualization/instance capabilities (up to seven instances) are concerned. There will, of course, be differences as far as power limits are concerned.
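
For anyone who wants to double-check what one of these boards reports once it's installed, here is a minimal sketch (not from the article; device index 0 and the file name are assumptions) that queries the compute capability and memory size through the CUDA runtime API. An A100 should report compute capability 8.0, and this model roughly 80GB of global memory:

Code:
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    cudaDeviceProp prop;
    // Query device 0 (an assumption; pick the right index on multi-GPU systems).
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaGetDeviceProperties failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("Device:             %s\n", prop.name);
    // A100 (GA100) reports compute capability 8.0.
    printf("Compute capability: %d.%d\n", prop.major, prop.minor);
    // Roughly 80 GB of HBM2E on the card discussed above.
    printf("Global memory:      %.1f GB\n", prop.totalGlobalMem / 1e9);
    return 0;
}

Build with something like "nvcc query.cu -o query" (the file name is just an example).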

Nvidia has not officially introduced its A100-PCIe 80GB HBM2E compute card, but since it is listed in an official document found by VideoCardz, we can expect the company to launch it in the coming months. Since the card has not been launched yet, it's impossible to know the actual pricing. CDW's partners list A100 PCIe cards with 40GB of memory for $15,849 to $27,113, depending on the exact reseller, so it is pretty safe to assume that an 80GB version will cost more than that.

Nvidia's proprietary SXM compute GPU form factor has several advantages over regular PCIe cards. Nvidia's latest A100-SXM4 modules support a maximum thermal design power (TDP) of up to 400W (for both the 40GB and 80GB versions) because it is easier to supply that much power to such modules and to cool them (for example, with the refrigerant cooling system in the latest DGX Station A100). In contrast, Nvidia's A100 PCIe cards are rated for up to 250W, but they can be used in rack servers as well as in high-end workstations.
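
As a rough illustration of where that power-limit difference shows up in software, here is a minimal sketch (not from the article) that reads the board's power management limit via NVML; it assumes the NVML headers are installed, links against -lnvidia-ml, and uses device index 0. The same figure can also be read with "nvidia-smi -q -d POWER".

Code:
#include <nvml.h>
#include <stdio.h>

int main(void) {
    // Initialize NVML; requires the NVIDIA driver to be loaded.
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "nvmlInit failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    // Device index 0 is an assumption; adjust on multi-GPU systems.
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        char name[96] = {0};
        unsigned int limit_mw = 0;
        nvmlDeviceGetName(dev, name, sizeof(name));
        // Reported in milliwatts: per the article, expect about 250 W
        // on an A100 PCIe card and up to 400 W on an A100-SXM4 module.
        if (nvmlDeviceGetPowerManagementLimit(dev, &limit_mw) == NVML_SUCCESS)
            printf("%s power limit: %u W\n", name, limit_mw / 1000);
    }
    nvmlShutdown();
    return 0;
}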

Nvidia's cloud datacenter customers seem to prefer SXM4 modules over cards. As a result, Nvidia first launched its A100-SXM4 40GB HBM2E module (with 1.6TB/s of bandwidth) last year and followed up with a PCIe card version several months later. The company likewise introduced its A100-SXM4 80GB HBM2E module (with faster HBM2E) first, last November, but only started shipping it fairly recently.
...
Continue Reading
The following 1 user says Thank You to harlan4096 for this post:
  • silversurfer
#2
Additional Info:

https://www.anandtech.com/show/16792/nvi...-300-watts

https://www.tomshardware.com/news/nvidia...ting-boost
The following 1 user says Thank You to harlan4096 for this post:
  • silversurfer