Samsung Clears Major Hurdle with Fifth-Generation High Bandwidth Memory Chips, Poised to Supply Nvidia’s AI Processors by Q4 2024
In a significant stride for the world’s leading memory chipmaker, Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, have successfully passed Nvidia’s stringent tests for use in its artificial intelligence (AI) processors, according to three informed sources. This development marks a crucial milestone for Samsung, which has been in a tight race with local competitor SK Hynix to supply advanced memory chips that are integral for handling generative AI tasks.
The qualification of the eight-layer HBM3E chips is a noteworthy achievement, enhancing Samsung’s position in the competitive landscape of memory technology. Although a supply agreement between Samsung and Nvidia has yet to be formalized, the sources anticipate that such a deal will be finalized soon, with shipments expected to commence by the fourth quarter of 2024.
While the eight-layer version has received Nvidia’s approval, Samsung’s 12-layer HBM3E chips have not yet passed Nvidia’s tests, the sources said, speaking on condition of anonymity because the matter is confidential.
Samsung’s success in this qualification process underscores its capability to deliver cutting-edge memory solutions and strengthens its prospects in the AI processor market, setting the stage for significant advancements in AI technology.