AMD revealed a few more details regarding its upcoming line of artificial intelligence processors, in an effort to compete with its rivals and help data centers handle the growing demand for AI. The lineup was teased earlier this year at the Consumer Electronics Show (CES) 2023.
AMD's Bet for Generative AI
As OpenAI's ChatGPT becomes one of the most popular AI tools, many have been wondering whether any GPUs other than NVIDIA's A100 and H100 can meet the compute and large memory requirements of LLMs. According to a report from Bloomberg, Advanced Micro Devices Inc. aims to change that later this year.
AMD introduced the Instinct MI300 series earlier this year, and Chief Executive Officer Lisa Su has now unveiled more details about the GPU. During a presentation in San Francisco today, the company said the series will include an accelerator designed to speed up processing for generative AI, the technology behind ChatGPT and other chatbots.
The company also introduced the Instinct MI300X accelerator during the presentation, which it claims is the world's most advanced accelerator for generative AI. It is based on the next-generation AMD CDNA 3 accelerator architecture and supports up to 192 GB of HBM3 memory, aiming to provide the compute and memory efficiency needed for training LLMs.
Thanks to that large memory capacity, customers can now fit LLMs such as Falcon-40B, a 40-billion-parameter model, on a single MI300X accelerator. Aside from this, the company also introduced the AMD Instinct Platform, which brings the MI300X into an industry-standard design for AI inference and training.
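To see why a 40B-parameter model can fit on a single 192 GB accelerator, a rough back-of-envelope estimate helps. The sketch below is illustrative only (the 2-bytes-per-parameter figure assumes 16-bit weights and ignores activations and overhead; it is not an AMD-published calculation):

```python
def weights_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed to hold a model's weights, in GB.

    Assumes 16-bit (2-byte) precision per parameter by default.
    """
    return num_params * bytes_per_param / 1e9

# A 40B-parameter model like Falcon-40B in 16-bit precision:
falcon_40b_gb = weights_memory_gb(40e9)
print(f"Falcon-40B weights: ~{falcon_40b_gb:.0f} GB")       # ~80 GB
print(f"Fits in 192 GB of HBM3: {falcon_40b_gb < 192}")     # True
```

Even with room left over for activations and KV caches, the weights alone comfortably fit within 192 GB, which is the crux of AMD's single-accelerator claim.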
Lastly, the AMD Instinct MI300A was introduced as the world's first APU accelerator for HPC and AI workloads. It is currently sampling to customers, while the MI300X will begin sampling to key customers in the third quarter of the year.
Racing to Offer the World's First CPU, GPU Combo
As the chip market becomes more challenging, AMD is racing to meet the increasing demand for AI computing, with data centers being pushed to their limits. This is especially true for popular services that rely heavily on large language models and algorithms, which ingest massive amounts of data to answer queries and generate images as fast as possible.
CEO Su added that artificial intelligence is still early in its life cycle, but AMD is preparing heavily for it. WCCF Tech reported that the total addressable market for data center AI accelerators will increase fivefold to more than $150 billion by 2027.
AMD, Intel, and NVIDIA are currently racing to offer the world's first CPU and GPU combo. Intel's bet was supposed to be Falcon Shores, which was going to combine an x86 CPU with a leading-edge GPU. Those plans, however, did not come to fruition, and the project has faced delays.
As per a Forbes report, Microsoft and OpenAI need an alternative to NVIDIA, and the timing could be perfect: the report projects that AMD will make them an offer they cannot refuse, given that the MI300X looks to be a solid contender. Although this has not been confirmed, the report stated it remains a possibility.