NVIDIA Blackwell B100 GPUs To More Than Double The Performance of Hopper H200 GPUs In 2024

NVIDIA has unveiled the first performance teaser of its next-gen Blackwell B100 GPUs, which will more than double the performance of the Hopper H200 in 2024.
NVIDIA Blackwell B100 AI GPUs To Offer More Than 2x Performance Versus Hopper H200 GPUs In 2024
During its SC23 special address, NVIDIA teased the performance of its next-gen GPUs, codenamed Blackwell, which will offer more than 2x the AI performance of Hopper GPUs when they make their debut in 2024. The GPU used was the next-generation B100, which will succeed the Hopper H200 and can be seen easily crushing the GPT-3 175B inference benchmark, showcasing its massive AI performance potential.
For the past two years, NVIDIA has relied on its Hopper and Ampere GPUs to serve the needs of AI & HPC customers worldwide, collaborating with various partners, but all of that is about to change in 2024 with the arrival of Blackwell. NVIDIA saw a huge boost to its data center and overall company revenue thanks to the AI craze, and it looks like that train is going full steam ahead as the green team aims to launch two brand-new GPU families by 2025.
The first of these new AI/HPC GPU families from NVIDIA is going to be Blackwell, named after David Harold Blackwell (1919-2010). The GPU will be the successor to the GH200 Hopper series & will use the B100 chip. The company plans on offering various products including the GB200NVL (NVLINK), the standard GB200, and the B40 for visual compute acceleration. The next-gen lineup is expected to be unveiled at the next GTC (2024), followed by a launch sometime later in 2024.

Current rumors estimate that NVIDIA will be utilizing the TSMC 3nm process node to produce its Blackwell GPUs, with the first customers to receive chips by the end of 2024 (Q4), though the latest reports have highlighted that NVIDIA is fast-tracking production to Q2 2024, the same time its recently announced Hopper H200 GPUs will be made available. Samsung is said to be a major memory supplier for NVIDIA's next-gen Blackwell GPUs too.
The GPU is also expected to be the first HPC/AI accelerator from NVIDIA to utilize a chiplet design, and it will compete with AMD's Instinct MI300 accelerator, which is also going to be a big deal within the AI space, as the red team has touted it to be.

The other chip that has been disclosed is the GX200, the follow-up to Blackwell, with a launch scheduled for 2025. NVIDIA has been following a two-year cadence between its AI & HPC products, so it is likely that we might only see an announcement of the chip by 2025, with actual units starting shipments by 2026.
The lineup will be based on the X100 GPU and will include a GX200 lineup of products and a separate X40 lineup for enterprise customers. NVIDIA is known to name its GPUs after famous scientists, and it already uses the Xavier codename for its Jetson series, so we can expect a different scientist's name for the X100 series. Besides that, there is little that we know about the X100 GPUs, but the codename is much better than the "Hopper-Next" placeholder that NVIDIA used in prior roadmaps.
NVIDIA also plans to deliver major "doubling" upgrades to its Quantum and Spectrum-X networking platforms with new BlueField and Spectrum products, offering up to 800 Gb/s transfer speeds by 2024 and up to 1600 Gb/s transfer speeds by 2025. These new networking and interconnect interfaces will also help the HPC/AI segment greatly in achieving the required performance.


NVIDIA Data Center / AI GPU Roadmap
GPU Codename | X | Blackwell | Hopper | Ampere | Volta | Pascal
---|---|---|---|---|---|---
GPU Family | GX200 | GB200 | GH200/GH100 | GA100 | GV100 | GP100
GPU SKU | X100 | B100 | H100/H200 | A100 | V100 | P100
Memory | HBM4? | HBM3e | HBM2e/HBM3/HBM3e | HBM2e | HBM2 | HBM2
Launch | 2025 | 2024 | 2022-2024 | 2020-2022 | 2018 | 2016