
AMD buys Silo AI to counter Nvidia

AMD turned its AI acquisition dial another notch this week, announcing a deal to buy Europe’s largest private AI lab, Finland’s Silo AI, for $665 million in cash.

Last August it bought French AI inference startup Mipsology, with tiny open-source AI compiler outfit Nod.ai following in October.

But those were small, tactical acquisitions, part of what AMD described at the time as a $125 million investment in AI.

Ingesting Silo AI for several times that sum is a much bigger statement of intent in a sector that increasingly resembles a frenzied technological assembly line.

AI hardware is an anxious place to be right now unless you are Nvidia. The company dominates the market for AI chips, with traditional microprocessor brands such as AMD and Intel trailing in its wake.

Tiny missteps can make all the difference as rivals try to build a viable platform to take on Nvidia and, longer term, the emerging AI chip-to-platform threat from tech superpowers Google, Microsoft, Meta, and Amazon.

AI scramble

As in its decades-long microprocessor rivalry with Intel, AMD’s selling point is its skill at playing second fiddle to a larger rival that nobody wants to see dominate the market completely.

Business abhors a true monopoly. AI is no different, which is why large tech platforms have carefully invested in their own AI chips even as they seem inclined to fill their datacenters with Nvidia hardware.

AMD finds itself in the same role once again in AI. The catch is that, squeezed from both sides, it has to demonstrate its competence. So far, it is keeping up: its open-source ROCm tools, which rival Nvidia’s CUDA platform, have gained a modest following, alongside advanced new hardware such as last December’s Instinct MI300 series datacenter GPUs.
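To give a concrete sense of what “rivaling CUDA” means in practice, here is a minimal, hypothetical sketch of ROCm’s HIP programming model. It is illustrative only and not drawn from AMD or Silo AI code; the file name, build command, and array sizes are assumptions for the example. HIP deliberately mirrors CUDA’s kernel syntax and runtime calls, so a simple vector-add looks almost line for line like the CUDA version.

    // illustrative_hip_vector_add.cpp -- hypothetical example, not AMD or Silo AI code.
    // Assumed build step, given a ROCm install with hipcc on the PATH:
    //   hipcc illustrative_hip_vector_add.cpp -o vadd
    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <vector>

    // Kernel syntax uses the same __global__ / threadIdx model as CUDA.
    __global__ void vector_add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);
        std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n, 0.0f);

        // Runtime calls map one-for-one onto their CUDA counterparts
        // (hipMalloc ~ cudaMalloc, hipMemcpy ~ cudaMemcpy, and so on).
        float *da = nullptr, *db = nullptr, *dc = nullptr;
        hipMalloc((void**)&da, bytes);
        hipMalloc((void**)&db, bytes);
        hipMalloc((void**)&dc, bytes);
        hipMemcpy(da, ha.data(), bytes, hipMemcpyHostToDevice);
        hipMemcpy(db, hb.data(), bytes, hipMemcpyHostToDevice);

        // The triple-chevron launch syntax is also shared with CUDA.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vector_add<<<blocks, threads>>>(da, db, dc, n);
        hipDeviceSynchronize();

        hipMemcpy(hc.data(), dc, bytes, hipMemcpyDeviceToHost);
        printf("hc[0] = %.1f\n", hc[0]);  // expect 3.0

        hipFree(da); hipFree(db); hipFree(dc);
        return 0;
    }

That near-parity is the point of ROCm: the lower the cost of porting existing CUDA code, the more credible AMD’s hardware becomes as an alternative.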

However, judging by the most recent Top500 supercomputing list, Nvidia’s platform still has an almost unassailable position in high-end AI applications.

Why Silo AI?

The high price paid for Silo AI, founded in 2017, reflects a young company that nonetheless offers a formidable mixture of know-how and technical achievement.

That includes its work scaling LLMs on one of the world’s most powerful supercomputers, the 380 petaFLOP Finland-based LUMI, which uses AMD’s 3rd-Gen EPYC 64-core CPUs and more than 12,000 Instinct MI250X GPUs.

On top of that, there’s the company’s Silo OS platform, a 300-strong team of engineers and AI consultants, and a customer book built up over more than 200 commercial projects.

The people side of the deal shouldn’t be underestimated. As with any emerging tech sector, the side battle is always about expertise and ideas. Engineers with practical knowledge of turning AI projects into something that can be put to business use are still in short supply globally. Getting hold of them now is what counts.

After that comes the price-performance battle, which is why the longer-term threats will also include Google, Microsoft, and Amazon. These hyperscalers are busy designing their own AI chips with the aim of selling AI inference capacity to their vast customer bases.

Leaving themselves dependent on Nvidia in AI would — at least in their view — be a version of the tail wagging the dog. The interesting question is whether Nvidia might want to join their ranks as a new type of AI-oriented hyperscaler at some point.

AMD said the Silo AI deal will close in the second half of 2024, with CEO and co-founder Peter Sarlin continuing to run the company’s AI team while reporting to AMD senior vice president Vamsi Boppana.

Source: Network World
