AI container orchestration
for everyone

dstack is a lightweight alternative to Kubernetes designed to streamline the development, training, and deployment of AI models in the cloud and on-prem, with support for NVIDIA and AMD GPUs as well as TPUs.

Dev environments

Before scheduling a task or deploying a model, you may want to run code interactively.

Dev environments allow you to provision a remote machine set up with your code and favorite IDE with just one command.
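For illustration, here is a minimal sketch of what a dev environment configuration can look like (the exact keys should be checked against dstack's reference docs; the name and resource values below are placeholders):

```yaml
# dev.dstack.yml — minimal dev environment sketch
type: dev-environment
name: my-dev            # placeholder name

python: "3.11"
ide: vscode             # attach your desktop IDE to the remote machine

resources:
  gpu: 24GB             # request a GPU with at least 24 GB of memory
```

Applying the file with the dstack CLI (e.g. `dstack apply -f dev.dstack.yml`) provisions a matching machine and sets it up with your code and IDE.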

Learn more

Tasks

A task allows you to schedule a job or run a web app. It lets you configure dependencies, resources, ports, and more. Tasks can be distributed and run on clusters.

Tasks are ideal for training and fine-tuning jobs or running apps for development purposes.
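As a sketch (keys and values are illustrative and should be verified against dstack's docs), a task configuration lists the commands to run and the resources it needs:

```yaml
# train.dstack.yml — minimal task sketch
type: task
name: train             # placeholder name

python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py     # placeholder training script

resources:
  gpu: 24GB
```

For distributed training, a task can additionally request multiple nodes so that it runs across a cluster.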

Learn more

Services

A service allows you to deploy a web app or a model as a scalable endpoint. It lets you configure dependencies, resources, authorization, auto-scaling rules, etc.
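A service configuration might look roughly like the sketch below (the image, model, and scaling values are placeholders; check dstack's docs for the exact schema):

```yaml
# service.dstack.yml — minimal service sketch
type: service
name: llm               # placeholder name

image: vllm/vllm-openai:latest          # illustrative inference image
commands:
  - vllm serve Qwen/Qwen2.5-7B-Instruct # illustrative model
port: 8000

resources:
  gpu: 24GB

replicas: 1..4          # illustrative auto-scaling range
scaling:
  metric: rps           # scale on requests per second
  target: 10
```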

Learn more

Fleets

Fleets enable efficient provisioning and management of clusters and instances, both in the cloud and on-prem.

Once a fleet is created, it can be reused by dev environments, tasks, and services.
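A cloud fleet can be described declaratively as well; the sketch below (name and sizes are placeholders) requests a small interconnected cluster:

```yaml
# fleet.dstack.yml — minimal cloud fleet sketch
type: fleet
name: my-fleet          # placeholder name

nodes: 2                # number of instances to provision
placement: cluster      # keep instances interconnected for multi-node jobs

resources:
  gpu: 80GB             # placeholder GPU memory requirement
```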

Learn more

Why ML engineers love dstack

Andrew Spott

ML Engineer at Stealth Startup

Thanks to @dstack, I get the convenience of having a personal Slurm cluster and using budget-friendly cloud GPUs, without paying the super-high premiums charged by the big three.

Alvaro Bartolome

ML Engineer at Argilla

With @dstack it's incredibly easy to define a configuration within a repository and run it without worrying about GPU availability. It lets you focus on data and your research.

Park Chansung

ML Researcher at ETRI

Thanks to @dstack, I can effortlessly access the top GPU options across different clouds, saving me time and money while pushing my AI work forward.

Eckart Burgwedel

CEO at Uberchord

With @dstack, running LLMs on a cloud GPU is as easy as running a local Docker container. It combines the ease of Docker with the auto-scaling capabilities of K8S.

Peter Hill

Co-Founder at CUDO Compute

@dstack simplifies infrastructure provisioning and AI development. If your team is on the lookout for an AI platform, I wholeheartedly recommend @dstack.

Get started in a minute

Open-source
Self-hosted. Always free.
Use your own cloud accounts or on-prem servers: AWS, Azure, GCP, Lambda, RunPod, Vast.ai, TensorDock, CUDO, OCI, K8S, or SSH.
CLI & API.
Install open-source

dstack Sky
Hosted by dstack. Pay per compute.
Access GPUs at the best possible rate via the GPU marketplace, or use your own cloud accounts and on-prem servers.
CLI & API.
Sign up now

FAQ

What is dstack?

dstack is an open-source orchestration engine that simplifies developing, training, and deploying AI models, as well as managing clusters on any cloud or data center. It provides a unified interface to manage AI model development at any scale, whether in the cloud or on-premises.

With dstack, you can utilize various cloud providers or on-prem infrastructure, along with any hardware, and leverage open-source frameworks and tools for both training and deployment.

How does it compare to Kubernetes?

Both dstack and Kubernetes are container orchestration engines that can be used in the cloud and on-prem. The difference is that dstack is much more lightweight and is tailored to simplifying AI development.

First of all, dstack offers an interface tailored for AI, allowing any AI engineer to use it out of the box for development, training, and deployment without needing additional tools or help from the Ops team.

Out of the box, dstack supports multiple cloud providers and offers capabilities comparable to a managed Kubernetes service. Integrating dstack with new cloud providers is also straightforward.

dstack is much easier to use for running containers on on-prem servers. If you have a cluster of on-prem servers, you just need to provide dstack with their hostnames and SSH credentials. dstack will automatically add them as a fleet that you can reuse to run containers.
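As a rough sketch (the hostnames, user, and key path below are placeholders), an on-prem fleet is described by pointing dstack at the servers over SSH:

```yaml
# ssh-fleet.dstack.yml — on-prem fleet sketch
type: fleet
name: on-prem-fleet             # placeholder name

ssh_config:
  user: ubuntu                  # placeholder SSH user
  identity_file: ~/.ssh/id_rsa  # placeholder private key
  hosts:
    - 192.168.1.10              # placeholder hostnames or IPs
    - 192.168.1.11
```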

Can I use dstack with Kubernetes?

If you already use Kubernetes, you can set up the dstack server to run containers via Kubernetes. In that case, dstack provides your AI engineers with a simple interface for running dev environments, tasks, and services without involving your Ops team or using any other tools.
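The general shape of such a setup is a Kubernetes backend in the dstack server's configuration; the snippet below is only an illustrative sketch (the project name, file paths, and exact backend fields should be taken from dstack's docs):

```yaml
# ~/.dstack/server/config.yml — illustrative Kubernetes backend sketch
projects:
  - name: main                      # placeholder project name
    backends:
      - type: kubernetes
        kubeconfig:
          filename: ~/.kube/config  # path to your kubeconfig
```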

Unless you're required to use Kubernetes, it's generally recommended to use dstack without it, setting up cloud accounts and on-prem servers directly via dstack's backends and fleets. In that case, instance provisioning is more efficient and convenient.

What is dstack Sky?

If you don't want to host the dstack server yourself or would like to access GPUs from dstack's marketplace, sign up with dstack Sky.

Unlike the open-source dstack where you have to set up and manage the server yourself, dstack Sky takes care of hosting the server on your behalf. Most importantly, dstack Sky offers access to cloud GPUs through its marketplace, ensuring you get them at competitive rates.

If needed, you can still configure dstack Sky to use your own cloud accounts or connect to your on-prem servers.

Have more questions, want a demo, or need help?
Join Discord