Best live route via RunPod
Live Floor
$1.9900
RunPod
Average Price
$1.9900
Across 1 live offer
Reliability
90.0%
Based on 1 scored host
Last Day Delta
+$0.0000
Latest daily low: 2026-03-31
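The summary metrics above (live floor, average price, reliability, last-day delta) are simple aggregates over the set of live offers. A minimal sketch of how they could be computed, assuming a hypothetical list of per-provider offer records (field names are illustrative, not the site's actual schema):

```python
# Hypothetical offer records; field names are assumptions for illustration.
offers = [
    {"provider": "RunPod", "price_hr": 1.99, "reliability": 0.90},
]

# Live floor: cheapest hourly rate across all live offers.
live_floor = min(o["price_hr"] for o in offers)

# Average price and reliability across scored offers.
average_price = sum(o["price_hr"] for o in offers) / len(offers)
reliability = sum(o["reliability"] for o in offers) / len(offers)

# Last-day delta: today's floor minus the previous daily low
# (prior low assumed equal here, giving a zero delta).
previous_daily_low = 1.99
last_day_delta = live_floor - previous_daily_low

print(f"Live floor:    ${live_floor:.4f}")
print(f"Average price: ${average_price:.4f}")
print(f"Last day delta: {last_day_delta:+.4f}")
```

With a single live offer, floor and average coincide and the delta is flat, which matches the figures shown above.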
30-Day Price Trend (chart)
Available Offers
Sorted by cheapest hourly rate
| Provider | Price/hr | VRAM | GPUs | Reliability | Region | Link |
|---|---|---|---|---|---|---|
| RunPod | $1.9900 | 80 GB | 1x | 90.0% | -- | View Deal |
GPU Overview
The NVIDIA H100 PCIE uses the Hopper architecture with 80 GB of HBM3 VRAM in a PCIe form factor. It provides datacenter-grade scalability with broader server compatibility, making it a practical choice for organizations integrating high-performance AI acceleration into existing infrastructure.
Its 80 GB of VRAM handles large language model (LLM) inference and fine-tuning workloads across a wide range of model sizes. The H100 PCIE delivers strong performance for enterprise-grade AI inference and training, with the flexibility to deploy in standard PCIe server configurations.