SK Hynix $75B investment in AI chips shows a growing trend

As the popularity of large language models drives demand for AI chips, demand for compute memory is growing with it. In response, SK Hynix, a South Korea-based company that supplies memory chips to AI vendors such as Nvidia, revealed plans to invest about $75 billion in AI chips through 2028.

The June 30 announcement comes after South Korea said last month that it will begin awarding eligible chipmakers financial support totaling about $19 billion. SK Hynix plans to invest about 80% of the $75 billion in high bandwidth memory (HBM) chips. HBM chips consume less energy than conventional memory chips and stack memory dies on top of one another, providing the bandwidth AI systems need.

A demand for memory chips

Memory chip providers such as Micron and Samsung, along with SK Hynix, have sold out of the chips for 2024 and 2025, a sign of how high demand for AI memory has become as the technology is increasingly integrated into areas such as mobile devices and contact centers.

"Basically, there's going to be this continued growth and demand for this," said Futurum Group analyst Dan Newman. "You're seeing this massive frontload of demand for AI chips."

Despite tech giants such as Meta and others buying thousands of GPUs from Nvidia, there is still a need for sufficient memory within AI systems, Newman added.

"Everybody kind of talks about can they get enough Nvidia chips, but they also have to be able to get access to enough memory," he said. "If we're going to continue to train larger models and continue to offer AI in more of our applications, we're going to need to have the associated compute and memory access to do that."

That nearly unquenchable need for compute and memory is what is driving SK Hynix and competitors such as Samsung and Micron to expand production of their AI memory chips.

New production

For instance, Micron is building HBM testing and mass production lines in Idaho. Micron also revealed that it received $6.1 billion from the U.S. CHIPS and Science Act. Samsung has also moved to start construction of a new semiconductor plant after delays.

"There's a war for memory chips because ultimately you need them with AI," said Constellation Research founder Ray Wang. "If you sell more AI, you need more HBM."

Ultimately, the interest in memory chips shows that the rapid development of generative AI technology benefits not only vendors such as Nvidia, Intel and AMD, but also those supplying power and memory for the data center.

"It's a booming opportunity," Newman said. "Right now, memory is one of the biggest booming cycles of all the chips."

Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.