AMD to cut 4% of workforce to prioritize AI chip expansion and rival Nvidia

Advanced Micro Devices (AMD) is laying off 4% of its global workforce, around 1,000 employees, as it pivots resources to developing AI-focused chips. This marks a strategic shift by AMD to challenge Nvidia’s lead in the sector.

“As a part of aligning our resources with our largest growth opportunities, we are taking a number of targeted steps that will unfortunately result in reducing our global workforce by approximately 4%,” CRN reported, quoting an AMD spokesperson.

“We are committed to treating impacted employees with respect and helping them through this transition,” the spokesperson added. However, it remains unclear which departments will bear the brunt of the layoffs.

A query to AMD seeking further clarification remained unanswered.

A surprise call

The latest layoffs were announced even as AMD’s quarterly earnings reflected strong results, with increases in both revenue and net profit.

This surprised many in the industry. Employees, too, expressed their shock on the community chat platform Blind, where word of the layoffs first surfaced before the company confirmed it.

However, a closer look at the Q3 results reveals both strengths and challenges: while total revenue rose 18% to $6.8 billion, gaming chip revenue plummeted 69% year-over-year, and embedded chip sales dropped 25%.

In its recent earnings call, AMD CEO Lisa Su underscored that the data center and AI business is now pivotal to the company’s future, projecting 98% growth in the segment for 2024.

Su attributed the recent revenue gains to orders from clients like Microsoft and Meta, with the latter now adopting AMD’s MI300X GPUs for internal workloads.

However, unlike AMD’s relatively targeted job reductions, Intel recently implemented far larger cuts, eliminating approximately 15,000 positions amid its restructuring efforts.

Data center and AI drive growth

AMD has been growing rapidly through initiatives such as optimizing Instinct GPUs for AI workloads and meeting data center reliability standards, which led to a $500 million increase in the company’s 2024 Instinct sales forecast.

Major clients like Microsoft and Meta also expanded their use of MI300X GPUs, with Microsoft using them for Copilot services and Meta deploying them for Llama models. Public cloud providers, including Microsoft and Oracle Cloud, along with several AI startups, adopted MI300X instances as well.

This highlights AMD’s intensified focus on AI, which has driven its R&D spending up nearly 9% in the third quarter. The increased investment supports the company’s efforts to scale production of its MI325X AI chips, which are expected to be released later this year.

In addition, AMD recently introduced its first open-source large language models under the OLMo brand, aiming to gain a stronger foothold in the AI market against industry leaders like Nvidia, Intel, and Qualcomm.

“AMD could very well build a great full-stack AI proposition with a play across hardware, LLM, and broader ecosystem layers, giving it a key differentiator among other major silicon vendors,” said Suseel Menon, practice director at Everest Group.

AMD is considered Nvidia’s closest competitor in the high-value chip market, powering the advanced data centers that handle the extensive data needs of generative AI technologies.

Source: Network World