Orchestrate GPU workloads effortlessly on any cloud

dstack is an open-source engine that streamlines the development, training, and deployment of AI models across any cloud provider.

Development

Experiment interactively in your IDE, terminal, or Jupyter notebooks before submitting long tasks or deploying models.

With dstack, a single command provisions the required cloud resources, syncs your code, and sets up the environment for your dev setup.
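For illustration, a minimal dev environment configuration in dstack's YAML format might look like the sketch below; the GPU size, IDE choice, and file name are just examples, so check the docs for the exact fields your dstack version supports.

```yaml
# .dstack.yml — dev environment sketch (illustrative)
type: dev-environment
python: "3.11"   # or point to a Docker image instead
ide: vscode      # attach the provisioned machine to your IDE
resources:
  gpu: 24GB      # request a GPU with at least 24 GB of memory
```

A single CLI command (dstack run in earlier versions, dstack apply in newer ones) then provisions the instance and opens it in your IDE.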

Learn more

Training

With dstack, running tasks such as training or fine-tuning scripts, or any other batch jobs, is incredibly easy.

Simply provide the commands and ports, and choose a Python version or a Docker image. dstack handles execution on the configured cloud GPU provider(s) with the necessary resources.
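As a rough sketch, a task configuration in dstack's YAML format could look like the following; the script name, requirements file, and port are placeholders, and field details may vary slightly between dstack versions.

```yaml
# train.dstack.yml — task sketch (illustrative)
type: task
python: "3.11"        # or specify a Docker image
commands:
  - pip install -r requirements.txt
  - python train.py
ports:
  - 6006              # e.g. expose TensorBoard while training
resources:
  gpu: 24GB           # GPU memory to request
```

Submitting it (for example with dstack run or dstack apply, depending on your version) provisions the resources, runs the commands, and forwards the listed ports.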

Learn more

Deployment

With dstack, deploying models or any other web apps is straightforward.

Just provide the commands and port, and select a Python version or a Docker image. dstack handles deployment on the configured cloud GPU provider(s) and gives you a public HTTPS endpoint.
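For example, a service configuration might look roughly like this; the serving command, port, and GPU size are placeholders, so consult the docs for the exact schema of your dstack version.

```yaml
# serve.dstack.yml — service sketch (illustrative)
type: service
python: "3.11"        # or specify a Docker image
commands:
  - python app.py     # placeholder for your model-serving command
port: 8000            # the port your app listens on
resources:
  gpu: 24GB
```

Once submitted, dstack provisions the GPU instance, runs the commands, and exposes the service behind a public HTTPS endpoint.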

Learn more

Get started in a minute

Open-source
Use dstack with your own cloud accounts and on-premises clusters.

Amazon Web Services
Azure
Google Cloud Platform
Lambda
TensorDock
Vast.ai
Kubernetes
100% open-source
dstack Sky
Get GPUs at the best prices and availability from a wide range of providers. No cloud account of your own is required.
Amazon Web Services
Azure
Google Cloud Platform
Lambda
TensorDock
Pay per compute. No commission.

FAQ

What is the difference between the open-source version and dstack Sky?

The open-source version lets you run workloads using your own cloud accounts. You can use it via the CLI or API, and it supports configuring multiple projects and users.

dstack Sky is a fully managed service that runs your workloads across multiple cloud providers to get you the best GPU prices and availability. You don't need individual accounts with each provider; dstack Sky manages everything for you.

Community

Join Discord