Samsung Electronics HBM3

Samsung’s High Bandwidth Memory 3 (HBM3) is its latest high-bandwidth memory technology for high-performance computing and AI applications. HBM3 stacks 12 layers of 10nm-class 16Gb DRAM dies to provide 24GB of capacity per stack, roughly 1.5 times that of the previous generation. Data transfer speeds reach 6.4 Gbps per pin, for a per-stack bandwidth of 819 GB/s, about 1.8 times faster than the previous generation.

HBM3E is organized as a 12-high stack and delivers 1,280 GB/s of bandwidth and 36 GB of capacity, meeting the higher capacity and performance demanded by AI service providers. Samsung also applies advanced thermal compression non-conductive film (TC NCF) technology to reduce the gaps between chips and improve thermal properties.

HBM3 uses a 1024-bit interface with 16 channels, allowing package capacities of up to 64 GB. This high bandwidth makes it especially useful for processing large amounts of data, as in AI workloads.
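
The quoted bandwidth follows directly from the pin speed and interface width. A quick sanity check in Python, using the 1024-bit interface and 6.4 Gbps per-pin rate stated above:

```python
# Sanity check: per-stack bandwidth from pin speed and interface width
pin_speed_gbps = 6.4      # data rate per pin, in Gb/s (HBM3 figure above)
interface_bits = 1024     # interface width in bits

bandwidth_gb_per_s = pin_speed_gbps * interface_bits / 8  # bits -> bytes
print(f"{bandwidth_gb_per_s:.1f} GB/s")  # -> 819.2 GB/s, matching the ~819 GB/s figure
```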

Samsung’s HBM3 and HBM3E are expected to play an important role in next-generation data centers and high-performance computing environments, with mass production slated to begin in the first half of 2024.


Samsung’s High Bandwidth Memory 3 (HBM3) is a cutting-edge high-bandwidth memory technology for high-performance computing and AI applications. Let’s take a closer look at the key features and performance of HBM3.

Performance and capacity

  • Speed: HBM3 reaches data transfer rates of up to 6.4 Gbps per pin. This is about 1.8 times faster than its predecessor, HBM2E.
  • Bandwidth: HBM3 provides up to 819 GB/s of bandwidth per stack, so it excels at AI and high-performance computing workloads that demand heavy data processing and fast computation.
  • Capacity: HBM3 stacks 12 layers of 10nm-class 16Gb DRAM dies to deliver 24GB of memory capacity, about 1.5 times that of the previous generation.
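
The capacity figure above can likewise be checked from the die count and per-die density:

```python
# Stack capacity from die count and per-die density (figures quoted above)
dies_per_stack = 12       # 12-layer stack
die_density_gbit = 16     # 16 Gb per DRAM die

capacity_gbyte = dies_per_stack * die_density_gbit / 8  # Gb -> GB
print(f"{capacity_gbyte:.0f} GB")  # -> 24 GB per stack
```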

Technical characteristics

  • Interface: HBM3 uses a 1024-bit interface and supports 16 channels, double the channel count of HBM2E.
  • Structure: HBM3 utilizes advanced thermal compression non-conductive film (TC NCF) technology to reduce the gaps between chips and improve thermal properties. This allows the 12-layer stack to maintain the same height as the traditional 8-layer stack.

Power Efficiency

  • HBM3 is approximately 10% more power efficient than previous generations, reducing the strain on servers while maintaining high performance.

Applications

  • HBM3 can be used in a variety of areas, including AI model training, data centers, and high-performance computing (HPC). In particular, by letting AI applications process more data faster, it can speed up AI training by an average of 34%.

Looking to the future

  • Samsung is already working on the next generation of HBM memory, HBM3E. HBM3E offers 36GB of capacity and 1,280 GB/s of bandwidth, targeting AI and data-center applications that require even higher performance and capacity.
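
Taking the rounded 1,280 GB/s figure at face value and assuming HBM3E keeps the same 1024-bit interface, the implied per-pin data rate works out as follows (the marketed per-pin speed may be quoted slightly differently):

```python
# Implied per-pin rate for HBM3E, assuming the same 1024-bit interface as HBM3
bandwidth_gb_per_s = 1280   # quoted HBM3E bandwidth (rounded)
interface_bits = 1024

pin_speed_gbps = bandwidth_gb_per_s * 8 / interface_bits  # bytes -> bits
print(f"{pin_speed_gbps:.1f} Gb/s per pin")  # -> 10.0 Gb/s
```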

Market Outlook

  • Samsung’s HBM3 is expected to enter mass production in the first half of 2024 and will be actively used in various industries to meet the demand for high-performance computing and AI applications.

Thanks to these performance and technical characteristics, Samsung’s HBM3 is expected to play an important role in the next-generation high-performance memory market.


Samsung and Nvidia collaboration

The collaboration between Samsung Electronics and Nvidia around HBM3 memory is playing an important role in the advancement of high-performance computing and AI applications. The two companies pair HBM3 with Nvidia’s GPUs to deliver high-performance memory solutions that exploit the memory’s full bandwidth.