

dstack 0.12.3: integration

dstack simplifies generative AI model development and deployment through its developer-friendly CLI and API. It eliminates cloud infrastructure hassles while supporting top cloud providers such as AWS, GCP, and Azure, among others.

While dstack streamlines infrastructure challenges, GPU costs can still hinder development. To address this, we've integrated dstack with a marketplace that provides GPUs from independent hosts at notably lower prices than other providers.

dstack 0.12.2: TensorDock integration

At dstack, we remain committed to our mission of building the most convenient tool for orchestrating generative AI workloads in the cloud. In today's release, we have added support for TensorDock, making it easier for you to leverage cloud GPUs at highly competitive prices.

dstack 0.12.0: Simplified cloud setup and refined API

For the past six weeks, we've been diligently overhauling dstack with the aim of significantly simplifying the process of configuring clouds and enhancing the functionality of the API. Please take note of the breaking changes, as they necessitate careful migration.

dstack 0.10.7: Services

Until now, dstack has supported dev-environment and task as configuration types. Even though task can be used for basic serving use cases, it lacks crucial serving features. With this update, we introduce service, a dedicated configuration type built for serving.
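To give a rough sense of the new configuration type, here is a minimal sketch of what a service definition might look like. The field names follow dstack's YAML configuration style, but the gateway address, port, and commands below are illustrative placeholders, not values from this release; consult the dstack documentation for the exact set of supported fields:

```yaml
# Hypothetical service configuration sketch.
# The gateway hostname and commands are placeholders for illustration.
type: service

gateway: example-gateway.mydomain.com
port: 8000

commands:
  - pip install -r requirements.txt
  - python app.py
```

Compared to a task, a service is meant to stay up and receive traffic through a configured endpoint rather than run to completion.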

dstack 0.9.1: Azure integration

At dstack, our goal is to create a simple and unified interface for ML engineers to run dev environments, pipelines, and apps on any cloud. With the latest update, we take another significant step in this direction.

dstack 0.7.0: Introducing dstack server

Last October, we open-sourced the dstack CLI for defining ML workflows as code and running them easily on any cloud or locally. The tool abstracts ML engineers from vendor APIs and infrastructure, making it convenient to run scripts, development environments, and applications.