GPU Profile · 80 GB VRAM · Data Center GPU

H100 PCIE

The NVIDIA H100 PCIE uses the Hopper architecture with 80 GB of HBM3 VRAM in a PCIe form factor.


Best live route via RunPod

Market Snapshot

Lowest: $1.9900/hr
Spread: $0.0000
Offers Live: 1
Providers: 1

Live Floor: $1.9900 (RunPod)

Average Price: $1.9900 (across 1 live offer)

Reliability: 90.0% (based on 1 scored host)

Last Day Delta: +$0.0000 (latest daily low: 2026-03-31)
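The snapshot figures above (floor, spread, average) follow directly from the list of live offers. A minimal sketch, using a hypothetical offer list in the same shape as this page's data:

```python
# Hypothetical offers: (provider, hourly price in USD). With a single live
# offer, the lowest price equals the average and the spread is zero,
# matching the snapshot above.
offers = [("RunPod", 1.99)]

prices = [price for _, price in offers]
lowest = min(prices)
spread = max(prices) - min(prices)
average = sum(prices) / len(prices)

print(f"Lowest:  ${lowest:.4f}/hr")   # → Lowest:  $1.9900/hr
print(f"Spread:  ${spread:.4f}")      # → Spread:  $0.0000
print(f"Average: ${average:.4f}")     # → Average: $1.9900
```

With more than one live offer, the same code yields a nonzero spread and an average that can sit above the floor.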

30-Day Price Trend (chart: daily lowest price)

Available Offers

H100 PCIE from all providers

Sorted by cheapest hourly rate

Provider | Price/hr | VRAM  | GPUs | Reliability | Region
RunPod   | $1.9900  | 80 GB | 1x   | 90.0%       | --

GPU Overview

About H100 PCIE

Built on NVIDIA's Hopper architecture with 80 GB of HBM3 VRAM in a PCIe form factor, the H100 PCIE provides datacenter-grade scalability with broad server compatibility, making it a practical choice for organizations integrating high-performance AI acceleration into existing infrastructure.

Its 80 GB of VRAM accommodates large language model (LLM) inference and fine-tuning across a wide range of model sizes. The H100 PCIE delivers strong performance for enterprise-grade AI inference and training, with the flexibility to deploy in standard PCIe server configurations.
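Whether a given model fits in the card's 80 GB can be estimated from its parameter count and numeric precision. A rough sketch; the 20% overhead factor for activations and KV cache is an assumption for illustration, not a measured figure:

```python
# Rough VRAM estimate for LLM inference: weights ≈ params × bytes per param,
# plus an assumed ~20% overhead for activations and KV cache.
def fits_in_vram(params_billions: float, bytes_per_param: int,
                 vram_gb: float = 80.0, overhead: float = 0.20) -> bool:
    weights_gb = params_billions * bytes_per_param  # 1B params × 1 byte ≈ 1 GB
    return weights_gb * (1 + overhead) <= vram_gb

# A 70B model in FP16 (2 bytes/param) needs ~168 GB, far beyond one card;
# a 13B model in FP16 (~31 GB with overhead) fits comfortably.
print(fits_in_vram(70, 2))  # → False
print(fits_in_vram(13, 2))  # → True
```

Larger models are typically served on this card either quantized (4-bit weights) or sharded across multiple GPUs.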

Key Features

  • VRAM: 80 GB
  • Category: Data Center GPU
  • Benchmark Score: 95

Best For

LLM Inference · AI Training · Enterprise AI · Scalable Deployment · Fine-Tuning · Rendering

Provider Split

  • RunPod: 1 offer

Recent Daily Lows

  • 2026-03-31: $1.9900
  • 2026-03-30: $1.9900
  • 2026-03-29: $1.9900
  • 2026-03-28: $1.9900
  • 2026-03-27: $1.9900

Related GPUs

B200 · B100 · GB200 · H100 SXM5 · H100 · A100 SXM4