Press Release

CoreWeave Achieves New Record-Breaking AI Inferencing Benchmark with NVIDIA GB200 Grace Blackwell Superchips


CoreWeave is the first cloud service provider to submit MLPerf Inference v5.0 results for NVIDIA GB200 Superchips

LIVINGSTON, N.J., April 2, 2025 /PRNewswire/ -- CoreWeave, the AI Hyperscaler™, today announced its MLPerf v5.0 results, setting a new industry benchmark in AI inference with NVIDIA GB200 Grace Blackwell Superchips. Using a CoreWeave instance with NVIDIA GB200, featuring two NVIDIA Grace CPUs and four NVIDIA Blackwell GPUs, CoreWeave delivered 800 tokens per second (TPS) on the Llama 3.1 405B model[1]—one of the largest open-source models.

"CoreWeave is committed to delivering cutting-edge infrastructure optimized for large-model inference through our purpose-built cloud platform," said Peter Salanki, Chief Technology Officer at CoreWeave. "These benchmark MLPerf results reinforce CoreWeave's position as a preferred cloud provider for leading AI labs and enterprises."

CoreWeave also submitted new results for NVIDIA H200 GPU instances, achieving 33,000 TPS on the Llama 2 70B model, a 40 percent improvement in throughput over NVIDIA H100 instances.[2]

These results further cement CoreWeave's position as an industry-leading cloud infrastructure services provider. This year, the company became the first to offer general availability of NVIDIA GB200 NVL72-based instances. Last year, the company was among the first to offer NVIDIA H100 and H200 GPUs, and it was one of the first to demo NVIDIA GB200 NVL72.

MLPerf Inference is an industry-standard suite for measuring machine learning performance across realistic deployment scenarios. How quickly systems can process inputs and produce results using a trained model has a direct impact on user experience.
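At its simplest, the tokens-per-second figure cited above divides the number of tokens a system generates by the wall-clock time taken. A minimal sketch of that calculation, using a hypothetical `generate` callable as a stand-in for a real inference endpoint (the function name and numbers here are illustrative, not CoreWeave's or MLPerf's actual harness):

```python
import time

def tokens_per_second(generate, prompt, n_runs=3):
    """Estimate throughput as total generated tokens / total wall-clock time."""
    total_tokens = 0
    total_time = 0.0
    for _ in range(n_runs):
        start = time.perf_counter()
        tokens = generate(prompt)  # assumed to return the generated tokens
        total_time += time.perf_counter() - start
        total_tokens += len(tokens)
    return total_tokens / total_time

# Stand-in "model" emitting a fixed 100 tokens per call, for illustration only.
fake_generate = lambda prompt: ["tok"] * 100
tps = tokens_per_second(fake_generate, "hello")
```

The real MLPerf Inference suite is far stricter, fixing scenarios (e.g., offline and server), accuracy targets, and latency constraints so that submitted scores are comparable across systems.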

About CoreWeave
CoreWeave, the AI Hyperscaler™, delivers a cloud platform of cutting-edge software powering the next wave of AI. The company's technology provides enterprises and leading AI labs with cloud solutions for accelerated computing. Since 2017, CoreWeave has operated a growing footprint of data centers across the US and Europe. CoreWeave was ranked as one of the TIME100 most influential companies and featured on Forbes Cloud 100 ranking in 2024. Learn more at www.coreweave.com.

Media Contact: Gurion Kastenberg, [email protected]

[1] Verified MLPerf® score of v5.0 Inference Closed Llama 3.1 405B offline. Retrieved from https://mlcommons.org/benchmarks/inference, 2 April 2025, entry 5.0-0076. The MLPerf name and logo are registered and unregistered trademarks of MLCommons Association in the United States and other countries. All rights reserved. Unauthorized use strictly prohibited. See www.mlcommons.org for more information.

[2] Verified MLPerf® score of v5.0 Inference Closed Llama 2 70B server. Retrieved from https://mlcommons.org/benchmarks/inference, 2 April 2025, entry 5.0-0077. The MLPerf name and logo are registered and unregistered trademarks of MLCommons Association in the United States and other countries. All rights reserved. Unauthorized use strictly prohibited. See www.mlcommons.org for more information.

SOURCE CoreWeave

