Microsoft Azure Takes AI to New Heights with Nvidia's Groundbreaking Blackwell Systems
Microsoft Azure has become the first cloud service provider to run Nvidia's in-demand Blackwell systems, using GB200-based AI servers. The news was confirmed in a post on X, which also revealed that Microsoft is optimizing its infrastructure for such advanced AI models with liquid cooling and InfiniBand networking technology.
Although the company did not disclose how many B200 processors are packed into each server rack, the figure is expected to be around 32. The test facility is designed to validate both the Blackwell GPUs and the new cooling system before these products are rolled out more broadly in the market.
With the FP4 data format, Nvidia's B200 GPU posts an astonishing 9 PFLOPS, and in FP8/INT8 it delivers about 2.3 times the performance of its predecessor, the H100 (4,500 TFLOPS/TOPS vs. 1,980 TFLOPS/TOPS). Such acceleration could significantly broaden the scope of AI applications and facilitate the training of complex LLMs.
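As a quick sanity check on those figures, the generational speedup follows directly from dividing the quoted peak throughput numbers (a rough sketch; the values below are the dense figures cited above, with no sparsity assumed):

```python
# Peak throughput figures as quoted in the article (dense, no sparsity).
B200_FP8_TFLOPS = 4500  # B200 peak FP8/INT8 throughput, in TFLOPS/TOPS
H100_FP8_TFLOPS = 1980  # H100 peak FP8/INT8 throughput, in TFLOPS/TOPS

# Generational speedup is simply the ratio of peak throughputs.
speedup = B200_FP8_TFLOPS / H100_FP8_TFLOPS
print(f"FP8/INT8 speedup over H100: {speedup:.2f}x")  # ~2.27x
```

Note that peak-throughput ratios are an upper bound: real LLM training workloads are also constrained by memory bandwidth and interconnect, so the realized speedup varies by model and batch size.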
Microsoft CEO Satya Nadella said that the company's long-standing collaboration with Nvidia will help it power industry-leading AI workloads. Further details on the Blackwell servers and Microsoft's AI work are expected at the Microsoft Ignite conference, held November 18–22, 2024, in Chicago. The high-end servers are likely to see wide adoption late this year or early next year.