South Korea’s SK Hynix (000660.KS), an Nvidia (NVDA.O) supplier, expects high-bandwidth memory (HBM) chips used in AI chipsets to account for a double-digit percentage of its DRAM chip sales in 2024, CEO Kwak Noh-Jung said on Wednesday.
This month, the world’s second-largest memory chipmaker began mass production of its next-generation advanced HBM chips, with sources indicating that initial shipments will go to Nvidia.
HBM chips are sophisticated memory chips in high demand for use in graphics processing units (GPUs) from Nvidia and other companies, which process massive quantities of data for generative AI.
SK Hynix has led the HBM chip industry as the exclusive supplier of the current generation, HBM3, to Nvidia, which holds about 80% of the AI chip market.
Analysts predict HBM chips will account for 15% of industry-wide DRAM sales this year, up from 8% in 2023.