November 2, 2022

SK Hynix begins mass production of HBM3 and ships it to Nvidia

SK Hynix was the first memory vendor to talk about HBM3 and the first to fully develop memory to this specification. Today the company said it has started mass production of HBM3, and these DRAMs will be used by Nvidia in its H100 compute GPUs and DGX H100 systems, which are set to ship in Q3.

SK Hynix’s HBM3 known good stacked dies (KGSDs) offer a maximum bandwidth of 819 GB/s per package, which corresponds to data transfer rates of up to 6.4 GT/s over a 1024-bit interface. As for capacity, each stack contains eight 2GB DRAM devices for a total of 16GB per package. SK Hynix also has 12-Hi 24GB KGSDs, but since Nvidia appears to be the company’s main HBM3 customer, it is starting production with 8-Hi stacks.
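For readers who want to sanity-check those per-stack figures, here is a minimal back-of-the-envelope calculation, assuming the standard 1024-bit HBM interface per stack:

```python
# Back-of-the-envelope check of the per-stack figures quoted above,
# assuming the standard 1024-bit HBM interface width per stack.
PINS_PER_STACK = 1024      # HBM3 interface width per stack, in bits
DATA_RATE_GTPS = 6.4       # per-pin transfer rate, GT/s (6.4 Gb/s per pin)
DIES_PER_STACK = 8         # 8-Hi stack
DIE_CAPACITY_GB = 2        # 16Gb (2GB) DRAM dies

bandwidth_gb_s = PINS_PER_STACK * DATA_RATE_GTPS / 8   # divide by 8: bits -> bytes
capacity_gb = DIES_PER_STACK * DIE_CAPACITY_GB

print(f"Per-stack bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~819.2 GB/s
print(f"Per-stack capacity:  {capacity_gb} GB")           # 16 GB
```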

The start of HBM3 mass production is good news for SK Hynix’s bottom line: for a while at least, the company will be the only supplier of this type of memory and can charge a hefty premium for these devices. Just as important for SK Hynix’s public image is that it has begun mass production of HBM3 ahead of its big rival Samsung.

Eventually, SK Hynix and other memory makers will offer HBM3 packages with up to sixteen 32Gb DRAM devices and capacities of 64GB per KGSD, but that is a longer-term prospect.
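The arithmetic behind that future ceiling, assuming 16-Hi stacks built from 32Gb (4GB) dies, is simple:

```python
# Hypothetical future 16-Hi HBM3 stack built from 32Gb dies.
dies_per_stack = 16
die_capacity_gb = 32 / 8                  # 32Gb expressed in gigabytes
print(dies_per_stack * die_capacity_gb)   # 64.0 GB per KGSD
```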

Nvidia’s H100 compute GPU comes equipped with 96GB of HBM3 DRAM, although due to ECC support and some other factors, users can access 80GB of ECC-capable HBM3 memory attached over a 5120-bit interface. To win the contract with Nvidia, SK Hynix worked closely with the company to ensure seamless interoperability between the processor and the memory devices.
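One way those H100 figures line up, assuming six 16GB stacks sit on the package but only five are enabled and exposed to software (an assumption for illustration, not an Nvidia statement), is sketched below:

```python
# How the H100 memory figures in the article relate, assuming 16GB HBM3
# stacks with the standard 1024-bit interface each, and five of the six
# on-package stacks enabled.
STACK_CAPACITY_GB = 16
STACK_BUS_BITS = 1024

stacks_on_package = 6
stacks_enabled = 5

print(stacks_on_package * STACK_CAPACITY_GB)   # 96 GB of DRAM on the package
print(stacks_enabled * STACK_CAPACITY_GB)      # 80 GB accessible to users
print(stacks_enabled * STACK_BUS_BITS)         # 5120-bit memory interface
```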

“Our goal is to become a solution provider that deeply understands and responds to our customers’ needs through continuous open collaboration,” said Kevin (Jongwon) Noh, President and Chief Marketing Officer of SK Hynix.

But Nvidia won’t be the only company using HBM3 for long. SiFive taped out its first system-on-chip supporting HBM3 on TSMC’s N5 node about a year ago, so that company could offer similar technology to its customers. Additionally, both Rambus and Synopsys have been offering silicon-proven HBM3 controllers and physical interfaces for some time and have landed numerous customers, so expect various SoCs supporting HBM3, mainly for AI and supercomputing applications, to arrive in the coming quarters.