Supporting Intel Gaudi AI accelerators with SSH fleets

At dstack, our goal is to make AI container orchestration simpler and fully vendor-agnostic. That’s why we support not just leading cloud providers and on-prem environments but also a wide range of accelerators.

With our latest release, we’re adding support for Intel Gaudi AI accelerators and announcing a new partnership with Intel.

Introducing GPU blocks and proxy jump for SSH fleets

Recent breakthroughs in open-source AI have made AI infrastructure accessible beyond public clouds, driving demand for running AI workloads in on-premises data centers and private clouds. This shift gives organizations both high-performance clusters and greater flexibility and control.

However, Kubernetes, while a popular choice for traditional deployments, is often too complex and low-level to address the needs of AI teams.

Originally, dstack was focused on public clouds. With the new release, dstack extends support to data centers and private clouds, offering a simpler, AI-native solution that replaces Kubernetes and Slurm.
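As a rough illustration, an on-prem cluster can be registered with dstack as an SSH fleet through a short YAML configuration. The sketch below assumes the fleet config schema from recent dstack releases; the fleet name, user, key path, and host addresses are placeholders:

```yaml
type: fleet
# A placeholder fleet name
name: my-onprem-fleet

# SSH credentials dstack uses to reach the hosts
ssh_config:
  user: ubuntu
  identity_file: ~/.ssh/id_rsa
  hosts:
    - 192.168.1.10
    - 192.168.1.11
```

Once such a file is applied (e.g. via `dstack apply`), the hosts become available as a fleet that dev environments, tasks, and services can run on, without requiring Kubernetes or Slurm on the machines themselves.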

Beyond Kubernetes: 2024 recap and what's ahead for AI infra

At dstack, we aim to simplify the development, training, and deployment of AI models by offering an alternative to the complex Kubernetes ecosystem. Our goal is to enable seamless AI infrastructure management across any cloud or hardware vendor.

As 2024 comes to a close, we reflect on the milestones we've achieved and look ahead to the next steps.