Best live route via RunPod

- Live Floor: $5.9800 (RunPod)
- Average Price: $5.9800 (across 1 live offer)
- Reliability: 90.0% (based on 1 scored host)
- Last Day Delta: +$0.0000 (latest daily low: 2026-03-31)
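The summary cards above are simple aggregates over the list of live offers. A minimal sketch of how they could be derived, assuming hypothetical field names and a previous-day low (both assumptions, not the site's actual pipeline):

```python
# Hypothetical offer records; "price_hr" and "reliability" are assumed field names.
offers = [
    {"provider": "RunPod", "price_hr": 5.98, "reliability": 0.90},
]

floor = min(o["price_hr"] for o in offers)                   # Live Floor
average = sum(o["price_hr"] for o in offers) / len(offers)   # Average Price
reliability = sum(o["reliability"] for o in offers) / len(offers)  # mean host score

yesterday_low = 5.98            # assumed previous daily low
delta = floor - yesterday_low   # Last Day Delta

print(f"Floor: ${floor:.4f}, Avg: ${average:.4f}, Delta: {delta:+.4f}")
```

With a single live offer, floor and average coincide, which is why both cards show $5.9800.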
30-Day Price Trend
Available Offers
Sorted by cheapest hourly rate
| Provider | Price/hr | VRAM | GPUs | Reliability | Region |
|---|---|---|---|---|---|
| RunPod | $5.9800 | 180 GB | 1x | 90.0% | -- |
GPU Overview
The NVIDIA B200 is built on the Blackwell architecture with 192 GB of HBM3e VRAM. Designed for datacenter-scale deployment, it delivers industry-leading memory capacity and compute throughput for scalable AI infrastructure and distributed training clusters.
With 192 GB of VRAM, the B200 handles large language model (LLM) inference and fine-tuning of the largest foundation models without model-parallelism constraints. Its benchmark score of 120 reflects top-tier performance for enterprise-grade AI, making it a strong choice for organizations running production inference and large-scale training pipelines.
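Whether a given model fits on a single B200 without model parallelism can be estimated with back-of-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter, plus headroom for the KV cache and activations. A hedged sketch (the 10% headroom figure and the example model sizes are illustrative assumptions, not measured numbers):

```python
VRAM_GB = 192  # B200 HBM3e capacity

def weights_gb(params_billions: float, bytes_per_param: int) -> float:
    """Approximate weight memory in GB: params (billions) x bytes per parameter."""
    return params_billions * bytes_per_param

# Reserve ~10% of VRAM for KV cache and activations (assumption).
budget = VRAM_GB * 0.9

for params_b, dtype, bpp in [(70, "fp16", 2), (180, "fp8", 1), (405, "fp16", 2)]:
    need = weights_gb(params_b, bpp)
    verdict = "fits" if need <= budget else "needs parallelism"
    print(f"{params_b}B @ {dtype}: ~{need:.0f} GB of weights -> {verdict}")
```

By this estimate a 70B model in fp16 (~140 GB of weights) fits on one card, while a 405B model in fp16 (~810 GB) still requires sharding across multiple GPUs.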