Spectrum-X

     By: Nick Gambino

AI is the flavor of the century, and everyone is getting in on it, and I mean everybody (tech companies mostly). Nvidia just announced Spectrum-X, a networking platform designed and built from the ground up to handle large AI workloads. It’s essentially Ethernet for AI.

“The delivery of end-to-end capabilities reduces run-times of massive transformer-based generative AI models,” Nvidia said in a press release. “This allows network engineers, AI data scientists and cloud service providers to improve results and make informed decisions faster.”

Spectrum-X is a backward-compatible platform built around a high-powered Spectrum-4 Ethernet switch and a BlueField-3 data processing unit (DPU). The two work hand in hand to pull off their magic. In action, this duo of high-performance hardware delivers 1.7 times the overall AI performance and power efficiency.

If you’re into the nerdier details, here they are: the AI-focused Ethernet switch can route 128 ports of 400Gb Ethernet or 64 ports of 800Gb Ethernet. The make-up of the switch itself speaks to how much it can handle: it’s a large chip packing 100 billion transistors onto a 90mm-by-90mm die.
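For a back-of-the-envelope sense of what those port counts mean, here’s a quick, purely illustrative calculation (not pulled from Nvidia’s materials) showing that both configurations add up to the same aggregate switching bandwidth:

```python
# Back-of-the-envelope check: both port configurations imply the same
# aggregate switching bandwidth (illustration only, not an Nvidia figure).
configs = {
    "128 x 400GbE": 128 * 400,  # gigabits per second
    "64 x 800GbE": 64 * 800,
}

for name, total_gbps in configs.items():
    print(f"{name}: {total_gbps / 1000:.1f} Tb/s aggregate")

# Both configurations work out to 51.2 Tb/s: the same total switching
# capacity carved into different port counts and speeds.
```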

The idea here, as detailed by Nvidia CEO Jensen Huang during a keynote in Taipei on Monday, is to bring high-end computing power to the current Ethernet market and let data centers become generative AI data centers without a complete overhaul.

The tech company is looking to solve data traffic congestion now, before it jams things up. By spreading traffic out across the whole network, the platform avoids pile-ups in AI clouds. The Ethernet switch also reportedly doesn’t drop packets, which Nvidia can boast as one of its best features.
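To make that “spreading traffic out” idea a little more concrete, here’s a rough conceptual sketch of balancing flows across multiple paths through a network fabric. The hashing scheme and path names below are generic assumptions for illustration, not a description of Spectrum-X’s actual adaptive routing:

```python
import hashlib

# Toy illustration: spread traffic flows across several network paths so no
# single link becomes the pile-up point. This is a generic ECMP-style hash,
# not Nvidia's routing logic.

PATHS = ["spine-1", "spine-2", "spine-3", "spine-4"]  # hypothetical paths

def pick_path(src: str, dst: str, src_port: int, dst_port: int) -> str:
    """Map a flow to one of the available paths using a stable hash."""
    key = f"{src}:{src_port}->{dst}:{dst_port}".encode()
    digest = hashlib.sha256(key).digest()
    return PATHS[digest[0] % len(PATHS)]

# Different flows land on different paths, keeping any one link from
# becoming congested while the rest of the fabric sits idle.
flows = [("gpu-node-1", "gpu-node-9", 50000 + i, 4791) for i in range(6)]
for src, dst, sport, dport in flows:
    print(f"{src} -> {dst} via {pick_path(src, dst, sport, dport)}")
```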

Technology is rushing by us, and Nvidia seems to have its finger on the pulse of so much more than just gaming these days. There’s no putting the generative AI genie (should we just call it “Genie AI”?) back in the lamp.

AI is here to stay and Nvidia is acutely aware of that.