AI titans Microsoft and Nvidia reportedly had a standoff over Microsoft’s use of B200 AI GPUs in its own server rooms


Recommended Posts

  • Diamond Member



AI titans
This is the hidden content, please
and Nvidia reportedly had a standoff over
This is the hidden content, please
’s use of B200 AI GPUs in its own server rooms

Nvidia is well known for its high-performance gaming GPUs and industry-leading AI GPUs. However, the trillion-dollar GPU manufacturer is also known for controlling how its GPUs are used beyond company walls. For example, it can be quite restrictive with its AIB partners’ graphics card designs. Perhaps not surprisingly, this level of control also appears to extend beyond card partners over to AI customers, including Microsoft. A recent report claims there was a standoff between Microsoft and Nvidia over how Nvidia’s new Blackwell B200 GPUs were to be installed in Microsoft’s server rooms.

Nvidia has been aggressively pursuing ever larger pieces of the data center pie, which is immediately clear if you look at how it announced the Blackwell B200 parts. Multiple times during the presentation, Jensen Huang indicated that he doesn’t think about individual GPUs any more — he thinks of the entire NVL72 rack as a GPU. It’s a rather transparent effort to gain additional revenue from its AI offerings, and that extends to influencing how customers install their new B200 GPUs.

Previously, the customer was responsible for buying and building appropriate server racks to house the hardware. Now, Nvidia is pushing customers to buy individual racks and even entire SuperPods — all coming direct from Nvidia. Nvidia claims this will boost GPU performance, and there’s merit to such talk considering all the interlinks between the various GPUs, servers, racks, and even SuperPods. But there’s also a lot of money changing hands when you’re building data centers at scale.

Nvidia’s smaller customers might be OK with the company’s offerings, but Microsoft wasn’t. Nvidia VP Andrew Bell reportedly asked Microsoft to buy a server rack design built specifically for its new B200 GPUs, with a form factor a few inches different from Microsoft’s existing server racks that are actively used in its data centers.

Microsoft pushed back on Nvidia’s recommendation, pointing out that the new server racks would prevent it from easily switching between Nvidia’s AI GPUs and competing offerings such as AMD’s MI300X GPUs. Nvidia eventually backed down and allowed Microsoft to design its own custom server racks for its B200 AI GPUs, but it’s probably not the last such disagreement we’ll see between the two megacorps.

A dispute like this is a sign of how large and valuable Nvidia has become over the span of just a year. Nvidia briefly became the world’s most valuable company earlier this week, and that title will likely change hands many times in the coming months. Server racks aren’t the only area Nvidia wants to control: the tech giant also decides how much GPU inventory gets allocated to each customer to maintain demand, and it’s using its dominant position in the AI space to push its own software and networking systems and cement its standing as a market leader.

Nvidia is benefiting massively from the AI boom, which started when ChatGPT exploded in popularity a year and a half ago. Over that same timespan, Nvidia’s AI-focused GPUs have become the company’s most in-demand and highest income-generating products, leading to incredible financial success. Nvidia’s stock price has soared and is currently over eight times higher than it was at the beginning of 2023, and over 2.5 times higher than at the start of 2024.

Nvidia continues to use all of this extra income to great effect. Its latest AI GPU, the Blackwell B200, will be the fastest graphics processing unit in the world for AI workloads. A “single” GB200 Superchip delivers up to a whopping 20 petaflops of compute performance (for sparse FP8) and is theoretically five times faster than its H200 predecessor. Of course, the GB200 is actually two B200 GPUs plus a Grace CPU, all on a large daughterboard, and each B200 itself uses two large dies linked together for good measure. The ‘regular’ B200 GPU, on the other hand, only offers up to 9 petaflops FP4 (with sparsity), but even so, that’s a lot of number-crunching prowess. It also delivers up to 2.25 petaflops of dense compute for FP16/BF16, which is what’s used in AI training workloads.
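To put those numbers side by side, here is a minimal Python sketch (ours, not Nvidia’s or the original report’s) that simply restates the figures quoted above and derives the ratios they imply, such as the H200 throughput implied by the “five times faster” claim. All the constant names are made up for illustration.

    # Figures as quoted in the article, in petaflops (PFLOPS)
    GB200_SPARSE_FP8_PFLOPS = 20.0   # per GB200 Superchip
    B200_PER_GB200 = 2               # a GB200 pairs two B200 GPUs with one Grace CPU
    B200_SPARSE_FP4_PFLOPS = 9.0     # single B200, FP4 with sparsity
    B200_DENSE_FP16_PFLOPS = 2.25    # single B200, dense FP16/BF16
    GB200_VS_H200_SPEEDUP = 5.0      # "theoretically five times faster than its H200 predecessor"

    # Each B200's share of the Superchip figure
    per_gpu_share = GB200_SPARSE_FP8_PFLOPS / B200_PER_GB200

    # H200 throughput implied by the "five times faster" claim
    implied_h200_pflops = GB200_SPARSE_FP8_PFLOPS / GB200_VS_H200_SPEEDUP

    # Ratio between the two quoted B200 numbers (precision plus sparsity gains)
    fp4_sparse_vs_fp16_dense = B200_SPARSE_FP4_PFLOPS / B200_DENSE_FP16_PFLOPS

    print(f"Per-B200 share of the GB200 figure: {per_gpu_share:.1f} PFLOPS")
    print(f"Implied H200 throughput: {implied_h200_pflops:.1f} PFLOPS")
    print(f"B200 sparse FP4 vs. dense FP16/BF16: {fp4_sparse_vs_fp16_dense:.1f}x")

Running it prints a per-B200 share of 10.0 PFLOPS, an implied H200 figure of 4.0 PFLOPS, and a 4.0x gap between the two quoted B200 figures.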

As of June 20th, 2024, Nvidia stock has dropped slightly from its $140.76 high and closed at $126.57. We’ll have to see what happens once Blackwell B200 begins shipping en masse. In the meantime, Nvidia continues to go all-in on AI.




