
Accessing a GPU marketplace with dstack

With dstack 0.12.3, you can now use a GPU marketplace as a cloud provider.

dstack simplifies gen AI model development and deployment through its developer-friendly CLI and API. It eliminates cloud infrastructure hassles while supporting top cloud providers (such as AWS, GCP, and Azure).

While dstack streamlines infrastructure challenges, GPU costs can still hinder development. To address this, we've integrated dstack with a marketplace that provides GPUs from independent hosts at notably lower prices than other providers.

World's cheapest GPUs with TensorDock and dstack

With v0.12.2, you can now use cloud GPUs at highly competitive prices via TensorDock.

At dstack, we remain committed to our mission of building the most convenient tool for orchestrating generative AI workloads in the cloud. In today's release, we have added support for TensorDock, making it easier for you to leverage cloud GPUs at highly competitive prices.

Simplified cloud setup and refined API

The v0.12.0 update makes it much easier to configure clouds and enhances the API.

For the past six weeks, we've been diligently overhauling dstack with the aim of significantly simplifying the process of configuring clouds and enhancing the functionality of the API. Please take note of the breaking changes, as they necessitate careful migration.

Multi-cloud and multi-region GPU workloads

The v0.11 update now automatically finds the cheapest GPU across clouds and regions.

The latest release of dstack enables the automatic discovery of the best GPU price and availability across multiple configured cloud providers and regions.

Introducing services to simplify deployment

The 0.10.7 update introduces services, a new configuration type for easier deployment.

Until now, dstack has supported dev-environment and task as configuration types. Even though task may be used for basic serving use cases, it lacks crucial serving features. With the new update, we introduce service, a dedicated configuration type for serving.
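As an illustration, a minimal service configuration might look like the following sketch. The `type: service`, `port`, and `commands` fields follow dstack's YAML configuration format; the command itself is a placeholder, and the exact schema may vary between versions:

```yaml
type: service

# the port the served application listens on
port: 8000

# placeholder command; replace with your model server's launch command
commands:
  - python -m http.server 8000
```

Unlike a task, a service is meant to stay up and receive traffic, which is what makes a dedicated configuration type useful for serving.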

Lambda Cloud GA and Docker support

The 0.10.5 release improves Lambda Cloud integration and adds support for Docker.

In the previous update, we added initial integration with Lambda Cloud. With today's release, this integration has significantly improved and is now generally available. Additionally, the latest release adds support for custom Docker images.
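For illustration, a custom Docker image can be referenced from a configuration via the `image` field. This is a sketch based on dstack's YAML configuration format; the image name and command are placeholders, and the exact schema may differ by version:

```yaml
type: task

# run the workload in a custom Docker image instead of dstack's default
image: nvcr.io/nvidia/pytorch:23.07-py3

# placeholder command; replace with your training script
commands:
  - python train.py
```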

Say goodbye to managed notebooks

Why managed notebooks are losing ground to cloud dev environments.

Data science and ML tools have made significant advancements in recent years. This blog post examines the advantages of cloud dev environments (CDEs) for ML engineers and compares them with web-based managed notebooks.

Azure, better UI and more

The 0.9.1 update introduces Azure support among other improvements.

At dstack, our goal is to create a simple and unified interface for ML engineers to run dev environments, pipelines, and apps on any cloud. With the latest update, we take another significant step in this direction.

An early preview of dstack server

The 0.7 update introduces the server with UI, team management, and more.

Last October, we open-sourced the dstack CLI for defining ML workflows as code and running them easily on any cloud or locally. The tool abstracts ML engineers from vendor APIs and infrastructure, making it convenient to run scripts, development environments, and applications.

GCP support just landed

The 0.2 update adds support for Google Cloud Platform (GCP).

With the release of version 0.2 of dstack, it is now possible to configure GCP as a remote. All features that were previously available for AWS, except real-time artifacts, are now available for GCP as well.