Intel, Google, Microsoft, Meta and other tech heavyweights are establishing a new industry group, the Ultra Accelerator Link (UALink) Promoter Group, to guide the development of the components that link together AI accelerator chips in data centers.
Announced Thursday, the UALink Promoter Group — which also counts AMD (but not Arm just yet), Hewlett Packard Enterprise, Broadcom and Cisco among its members — is proposing a new industry standard to connect the AI accelerator chips found inside a growing number of servers. Broadly defined, AI accelerators are chips ranging from GPUs to custom-designed solutions that speed up the training, fine-tuning and running of AI models.
“The industry needs an open standard that can be moved forward very quickly, in an open [format] that allows multiple companies to add value to the overall ecosystem,” Forrest Norrod, AMD’s GM of data center solutions, told reporters in a briefing Wednesday. “The industry needs a standard that allows innovation to proceed at a rapid clip unfettered by any single company.”
Version one of the proposed standard, UALink 1.0, will connect up to 1,024 AI accelerators — GPUs only — across a single computing “pod.” (The group defines a pod as one or several racks in a server.) UALink 1.0, based on “open standards” including AMD’s Infinity Fabric, will allow for direct loads and stores between the memory attached to AI accelerators, and generally boost speed while lowering data transfer latency compared to existing interconnect specs, according to the UALink Promoter Group.
The group says it’ll create a consortium, the UALink Consortium, in Q3 to oversee development of the UALink spec going forward. UALink 1.0 will be made available around the same time to companies that join the consortium, with a higher-bandwidth updated spec, UALink 1.1, set to arrive in Q4 2024.
The first UALink products will launch “in the next couple of years,” Norrod said.
Conspicuously absent from the list of the group’s members is Nvidia, which is by far the largest producer of AI accelerators, with an estimated 80% to 95% of the market. Nvidia declined to comment for this story. But it’s not hard to see why the chipmaker isn’t enthusiastically throwing its weight behind UALink.
For one, Nvidia offers its own proprietary interconnect tech for linking GPUs within a data center server. The company is likely none too keen to support a spec based on rival technologies.
Then there’s the fact that Nvidia is operating from a position of enormous strength and influence.
In Nvidia’s most recent fiscal quarter (Q1 2025), the company’s data center sales, which include sales of its AI chips, rose more than 400% from the year-ago quarter. If Nvidia continues on its current trajectory, it’s set to surpass Apple as the world’s second-most valuable firm sometime this year.
So, simply put, Nvidia doesn’t have to play ball if it doesn’t want to.
As for Amazon Web Services (AWS), the lone public cloud giant not contributing to UALink, it might be in “wait and see” mode as it chips (no pun intended) away at its various in-house accelerator hardware efforts. It could also be that AWS, with a stranglehold on the cloud services market, doesn’t see much strategic point in opposing Nvidia, which supplies much of the GPUs it serves to customers.
AWS didn’t respond to cryptonoiz’s request for comment.
Indeed, the biggest beneficiaries of UALink — besides AMD and Intel — appear to be Microsoft, Meta and Google, which combined have spent billions of dollars on Nvidia GPUs to power their clouds and train their ever-growing AI models. All want to wean themselves off a vendor they see as worrisomely dominant in the AI hardware ecosystem.
In a recent report, Gartner estimates that the value of AI accelerators used in servers will total $21 billion this year, rising to $33 billion by 2028. Revenue from AI chips, meanwhile, will hit $33.4 billion by 2025, Gartner projects.
Google has custom chips for training and running AI models, TPUs and Axion. Amazon has several AI chip families under its belt. Microsoft jumped into the fray last year with Maia and Cobalt. And Meta is refining its own lineup of accelerators.
Elsewhere, Microsoft and its close collaborator, OpenAI, reportedly plan to spend at least $100 billion on a supercomputer for training AI models that will be outfitted with future versions of Cobalt and Maia chips. Those chips will need something to link them — and perhaps it’ll be UALink.