AI Wave Takes Over High-Bandwidth Memory Chip Supply for 2024

The AI wave takes a toll on the chip supply.

High-bandwidth memory chips from SK Hynix and Micron are reportedly nearly sold out for the upcoming year as a direct result of the booming demand for artificial intelligence.

Two of the world's biggest memory chip producers, SK Hynix and Micron, say their high-bandwidth memory chips are nearly sold out for 2024, with most of their 2025 supply already spoken for.

The market for high-end memory chips has expanded due to the demand for AI chipsets, greatly helping companies like Samsung Electronics and SK Hynix, the top two memory chip manufacturers globally.

Nvidia currently sources its chips from SK Hynix, but the company also considers Samsung a possible supplier.


High-performance memory chips are essential for training large language models (LLMs) such as the ones behind OpenAI's ChatGPT, whose popularity has caused AI adoption to soar. To deliver human-like responses, LLMs rely on these chips to retain information from previous interactions with users as well as their preferences.

SK Hynix plans to boost output to meet the spike in demand. This will be accomplished by making investments in the M15X fab in Cheongju, the Yongin semiconductor cluster in South Korea, and cutting-edge packaging facilities in Indiana.

According to CNBC, the need for AI chips is being fueled by major tech companies like Amazon, Google, and Microsoft, which are investing billions to train their LLMs to stay competitive.

AI Demand Strains China's Water Supply

Previous reports have highlighted that, according to the non-profit organization China Water Risk, China's water supply could be strained by the steadily increasing volumes of water needed to run artificial intelligence workloads and data centers.

This growing demand for AI is expected to put a strain on a variety of resources.

The Hong Kong-based firm estimates that China's data centers use over 1.3 billion cubic meters (343 billion gallons) of water yearly, enough to meet the residential needs of 26 million people.

With more data centers expected to open by 2030, the volume might exceed 3 billion cubic meters.

The organization projected that by the end of the decade, China will host over 11 million data center racks holding servers, cables, and other equipment.

That is nearly three times the roughly 4 million racks in place in 2020.

AI Demand Could Strain the Electricity Grid

Furthermore, as global cloud computing provider CoreWeave recently pointed out, AI's current "underestimated" demand for more computing capacity, data centers, and electricity could eventually put stress and constraints on the world's electricity grids.

The prediction came from CoreWeave co-founder Brian Venturo, who says the data center requests the company regularly receives are "absurd," with some customers asking for entire campuses.

The co-founder believes that supply chains that have historically supported heavily physical industries are not built to keep pace with how quickly the market is evolving.

More "megacampuses," in his opinion, will strain electrical systems and heighten political controversies.

