Flash News

AMD CEO Lisa Su: AI Agents Drive Explosive Growth in CPU Demand

AMD CEO Lisa Su stated in a CNBC interview that AI Agents are driving enormous demand for computing power throughout the AI cycle, significantly raising growth expectations for the CPU market.

Businesses have recognized the real commercial value of AI, and adoption is rising rapidly. Beyond AI accelerators, the new generation of AI Agents requires substantial general-purpose computing power, driving strong CPU demand. The CPU-to-GPU demand ratio has risen from the previous 1:4, or even 1:8, to nearly 1:1.

AMD has raised its expected compound annual growth rate for the addressable CPU market from 18-20% to over 35%, anticipating that the CPU market size will exceed $120 billion by 2030. Su emphasized that the practicality and commercialization of AI are progressing faster than expected six months ago.

Source: Public Information

ABAB AI Insight

Su has consistently emphasized heterogeneous CPU+GPU computing since leading AMD to launch the MI-series accelerators in 2022. This public upward revision of CPU market expectations extends her view that AI Agents are evolving from "chat tools" into "autonomous working systems." AMD had previously optimized its EPYC processors for AI inference and multi-agent coordination.

On the commercial side, AMD is shifting its data center revenue focus from standalone GPUs to bundled CPU+GPU sales, expanding its total addressable market by growing the share of general-purpose computing. The company also aims to accelerate enterprise server upgrade cycles through AI Agent deployment, targeting higher gross-profit growth in the CPU business from 2026 to 2030.

Much like Intel's Granite Rapids and NVIDIA's Grace CPU strategies, AMD is accelerating the transformation of data center CPUs from engines of traditional enterprise computing into a core support platform for AI Agents.

Essentially, this is a technological substitution: AI Agents, through tasks that require substantial parallel general-purpose computing, are recasting CPUs from a peripheral role into core infrastructure, shifting capital from GPU-only acceleration toward a nearly 1:1 CPU+GPU configuration. Mechanically, the autonomous decision-making and long-term context maintenance of Agents drive explosive demand for general-purpose computing power, transforming data center workloads from training-dominated to a mixed load of inference plus Agents.

ABAB News · Cognitive Law

The smarter AI becomes, the closer demand for general-purpose computing approaches that for dedicated accelerators. When the CPU-to-GPU ratio nears 1:1, CPUs are no longer supporting players but the main battlefield of the new cycle. What truly determines the pace of AI deployment is often the underestimated general-purpose computing power.
