DGX Station is available (via OEM distributors)

The new workstation packs 8 Blackwell GPUs and 1.8TB of HBM3e memory for on-prem AI training.

Deep Dive

NVIDIA has officially launched the DGX Station GB300, positioning it as a 'personal AI supercomputer' for enterprise environments. The system is built around NVIDIA's new Blackwell architecture, specifically featuring eight B200 GPUs interconnected via the company's high-speed NVLink technology. This configuration delivers a staggering 1.8 terabytes of HBM3e memory, providing the massive bandwidth required for training and running the latest large language models (LLMs) and other complex AI workloads directly on-premises.
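To put that 1.8TB aggregate figure in perspective, a back-of-the-envelope estimate shows which model sizes fit in GPU memory as bf16 weights alone. The model sizes below are hypothetical examples, not NVIDIA benchmarks, and the estimate ignores activations, optimizer state, and KV cache, which add substantially to the footprint in practice:

```python
# Rough check: do a model's bf16 weights fit in the workstation's
# aggregate GPU memory? Illustrative arithmetic only.

BYTES_PER_PARAM_BF16 = 2  # bf16/fp16 stores 2 bytes per parameter
AGGREGATE_HBM_TB = 1.8    # total HBM3e across the GPUs, per the article

def weights_tb(num_params_billions: float) -> float:
    """Weight memory in terabytes (1 TB = 1e12 bytes)."""
    return num_params_billions * 1e9 * BYTES_PER_PARAM_BF16 / 1e12

def fits(num_params_billions: float) -> bool:
    """True if the weights alone fit in aggregate GPU memory."""
    return weights_tb(num_params_billions) <= AGGREGATE_HBM_TB

for size_b in (70, 405, 1000):
    print(f"{size_b}B params -> {weights_tb(size_b):.2f} TB of weights, "
          f"fits: {fits(size_b)}")
```

By this estimate a 405B-parameter model's weights (~0.81TB) fit comfortably, while a 1-trillion-parameter model (~2.0TB) would not, before even counting training state.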

Unlike previous consumer-facing 'Founders Edition' hardware, the DGX Station GB300 is being sold exclusively through NVIDIA's enterprise OEM distributor network, indicating that its target audience is corporations and research institutions rather than individual developers. The workstation form factor is a key selling point: it is designed to sit in a standard office or lab environment without requiring specialized data center cooling or power infrastructure. This makes cutting-edge AI compute more accessible to organizations that handle sensitive data bound by privacy regulations, or that simply prefer to keep development and training cycles in-house.

The system is powered by dual Intel Xeon CPUs and ships with NVIDIA's full AI Enterprise software stack, including the CUDA platform and frameworks optimized for the Blackwell architecture. By offering this level of performance in a localized package, NVIDIA is catering to the growing demand for sovereign AI and secure, proprietary model development. The DGX Station represents a significant step toward democratizing supercomputing-level resources, moving them from the cloud and centralized data centers to the desks of enterprise AI teams.
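Training across eight NVLink-connected GPUs with such a stack typically means data-parallel execution, where each GPU owns a slice of every global batch. A minimal pure-Python sketch of that sharding logic (the helper below is illustrative and not part of NVIDIA's or any framework's API, though it mirrors how a distributed sampler assigns samples round-robin to ranks):

```python
# Sketch of even data-parallel sharding: split a global batch of sample
# indices across 8 GPU ranks. Illustrative only, not an NVIDIA API.

WORLD_SIZE = 8  # one rank per GPU in the workstation

def shard_indices(num_samples: int, rank: int,
                  world_size: int = WORLD_SIZE) -> list[int]:
    """Return the sample indices owned by `rank` (round-robin assignment)."""
    return list(range(rank, num_samples, world_size))

shards = [shard_indices(32, r) for r in range(WORLD_SIZE)]

# Every sample is covered exactly once across the 8 ranks:
assert sorted(i for s in shards for i in s) == list(range(32))
print(shards[0])  # rank 0's slice of the global batch
```

Each rank computes gradients on its slice, and the high-bandwidth NVLink interconnect is what makes the all-reduce step that averages those gradients fast enough to keep the GPUs busy.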

Key Points
  • Features 8 NVIDIA Blackwell B200 GPUs with 1.8TB of HBM3e memory for massive AI workloads.
  • Sold exclusively through enterprise OEM channels, highlighting its corporate and institutional target market.
  • Designed in a workstation form factor to simplify on-premises deployment for sensitive data and sovereign AI projects.

Why It Matters

It brings data-center-level AI training capability on-premises, crucial for industries with strict data privacy and security requirements.