An easier way to train and deploy generative AI
dstack simplifies training, fine-tuning, and deploying generative AI models, leveraging the open-source ecosystem.
Want to quickly provision a dev environment for interactive development?
Specify a Docker image, your IDE, and required compute resources, then launch it via the CLI.
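As a rough sketch, a dev environment is described in a YAML configuration file; the exact field names should be checked against the dstack documentation, and the Python version, IDE, and GPU size below are placeholder choices:

```yaml
# Sketch of a dev-environment configuration (values are examples)
type: dev-environment

python: "3.11"   # interpreter version; a custom Docker image could be set instead
ide: vscode      # the IDE to attach to the environment

resources:
  gpu: 24GB      # requested GPU memory; adjust to your workload
```

The environment is then launched from the CLI, pointing it at this configuration file.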
Run a custom training job or batch task with your code.
Specify a Docker image (optional), the commands to run, exposed ports, compute resources, and a schedule, then submit it via the CLI or Python API.
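A task configuration might look like the sketch below; the commands, port, and resource values are illustrative placeholders, not prescribed settings:

```yaml
# Sketch of a task configuration (values are examples)
type: task

python: "3.11"

commands:
  - pip install -r requirements.txt
  - python train.py          # your training or batch script

ports:
  - 6006                     # e.g. expose TensorBoard while the task runs

resources:
  gpu: 80GB
```

Submitting the file via the CLI provisions the requested compute, runs the commands, and forwards the exposed ports.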
Deploy your model using custom code.
Specify a Docker image (optional), the commands to run, and the required compute resources, then deploy the model as a public endpoint via the CLI or Python API.
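A service deployment follows the same configuration style; in this sketch the image name, serving command, and port are hypothetical stand-ins for your own model server:

```yaml
# Sketch of a service configuration (values are examples)
type: service

image: ghcr.io/example/my-model-server:latest   # hypothetical serving image

commands:
  - python serve.py                             # your inference entrypoint

port: 8000                                      # the port the endpoint listens on

resources:
  gpu: 24GB
```

Once applied via the CLI or Python API, the service is exposed as a public endpoint on the provisioned compute.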