Orchestrate AI workloads in any cloud

dstack is an open-source orchestration engine for running AI workloads in any cloud or data center. Your models, your infrastructure.

Dev environments

Before scheduling a task or deploying a model, you may want to run code interactively.

Dev environments allow you to provision a remote machine set up with your code and favorite IDE with just one command.
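As an illustration, a dev environment is defined in a small YAML configuration file. The exact fields can vary by dstack version, so treat this as a minimal sketch:

```yaml
# .dstack.yml — a minimal dev environment sketch (field names may vary by version)
type: dev-environment

python: "3.11"   # Python version to pre-install
ide: vscode      # open the machine in your desktop VS Code

resources:
  gpu: 24GB      # request a GPU with at least 24GB of memory
```

You then provision it with a single CLI command such as `dstack run .` from the repo directory.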

Learn more

Tasks

Tasks let you conveniently schedule batch jobs, such as training, fine-tuning, or data processing, as well as run web applications.


You can run tasks on a single machine or on a cluster of nodes.
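A task is also a short YAML configuration; the sketch below is illustrative (the `train.py` script and resource sizes are assumptions, and field names may vary by version):

```yaml
# .dstack.yml — a task sketch for a fine-tuning job (illustrative)
type: task

python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py          # hypothetical training script

resources:
  gpu: 80GB                  # request a large-memory GPU
```

The same configuration style extends to multi-node cluster runs.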

Learn more

Services

Services let you deploy any kind of model as a public, secure, and scalable endpoint.
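A service adds a port to expose; the sketch below assumes a vLLM-based OpenAI-compatible server and an example model, both of which are illustrative choices rather than dstack requirements:

```yaml
# .dstack.yml — a service sketch serving an open-source LLM (illustrative)
type: service

port: 8000                   # the endpoint dstack exposes
python: "3.11"
commands:
  - pip install vllm
  # example model; swap in any model you want to serve
  - python -m vllm.entrypoints.openai.api_server
    --model mistralai/Mistral-7B-Instruct-v0.1

resources:
  gpu: 24GB
```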

Learn more

Pools

Pools enable the efficient reuse of cloud instances and on-premises servers across runs, simplifying their management.

Learn more

Get started in a minute

Open-source
Use your own cloud accounts or data centers.

AWS
Azure
GCP
Lambda
TensorDock
Vast.ai
CUDO
K8S
CLI & API
Self-hosted
Install open-source
Always free.
dstack Sky
Access GPUs from our marketplace at the best rate.
AWS
Azure
GCP
Lambda
TensorDock
Vast.ai
CUDO
CLI & API
Hosted by dstack
Web console
Sign up now
Pay per compute. No commission.
Enterprise
Use your own cloud accounts or data centers.

AWS
Azure
GCP
K8S
CLI & API
Self-hosted
Web console
Single sign-on
Audit logs
Book a demo
Contact us for pricing.

Why the community loves dstack

Andrew Spott

ML Engineer at Stealth Startup

Thanks to @dstack, I get the convenience of having a personal Slurm cluster and using budget-friendly cloud GPUs, without paying the super-high premiums charged by the big three.

Alvaro Bartolome

ML Engineer at Argilla

With @dstack it's incredibly easy to define a configuration within a repository and run it without worrying about GPU availability. It lets you focus on data and your research.

Park Chansung

ML Researcher at ETRI

Thanks to @dstack, I can effortlessly access the top GPU options across different clouds, saving me time and money while pushing my AI work forward.

Eckart Burgwedel

CEO at Uberchord

With @dstack, running an open-source LLM or a dev environment on a cloud GPU is as easy as running a local Docker container. It combines the ease of Docker with the auto-scaling capabilities of K8s.

Peter Hill

Co-Founder at CUDO Compute

@dstack is instrumental in simplifying infrastructure provisioning and AI model development. If your organization is on the lookout for a platform to speed up the adoption of AI, I wholeheartedly recommend @dstack.

Join Discord