Posts: 14,704
Threads: 9,636
Thanks Received: 9,083 in 7,233 posts
Thanks Given: 9,884
Joined: 12 September 18
20 March 25, 10:19
Quote:Next up in AI-GPUs: HBM4 in 2025, HBM4e in 2027
![[Image: HBM4-COMPUTERBASE-1200x624.jpg]](https://cdn.videocardz.com/1/2025/03/HBM4-COMPUTERBASE-1200x624.jpg)
The data-center GPU market is now moving away from HBM3 memory in favor of faster HBM3e technology, which enables higher density per layer and larger capacity per stack. Each HBM module also runs at a higher speed than before.
However, AI acceleration demands ever-faster memory, and companies like NVIDIA are among the fastest adopters of new High Bandwidth Memory variants. At GTC 2025, NVIDIA’s CEO presented a product roadmap confirming next-gen GPUs using HBM4 or even HBM4e technology, neither of which is commercially available yet.
At GTC 2025, the largest memory makers were also showcasing samples of upcoming HBM variants. All of the major players in this market, SK Hynix, Samsung, and Micron, have showcased their next-gen HBM4 memory. HBM4 will enable 24 GB per 8-high stack, 32 GB per 12-high stack, and 48 GB per 16-high stack, with speeds up to 9.2 Gbps (initially around 8 Gbps). Volume production should start in 2026.
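To put those pin speeds in context, here is a minimal sketch of the per-stack bandwidth they imply. It assumes a 2048-bit interface per stack (the width defined by the JEDEC HBM4 spec, double HBM3e's 1024-bit bus); only the 8 and 9.2 Gbps figures come from the article above.

```python
# Rough peak-bandwidth estimate per HBM4 stack.
# Assumption: 2048-bit bus per stack (JEDEC HBM4 spec, not stated in the article).

def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

HBM4_BUS_BITS = 2048  # assumed interface width per stack

for rate in (8.0, 9.2):  # initial and top pin speeds quoted in the article
    gbs = stack_bandwidth_gbs(HBM4_BUS_BITS, rate)
    print(f"{rate} Gbps/pin -> {gbs:.1f} GB/s (~{gbs / 1000:.2f} TB/s) per stack")
```

At the quoted 9.2 Gbps, that works out to roughly 2.36 TB/s per stack, versus about 1.2 TB/s for a 1024-bit HBM3e stack at the same pin rate.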
Continue Reading...