
Today, we are excited to release the dstack Whitepaper — our blueprint for the next generation of confidential computing.
For the first time, developers can deploy Docker containers into Trusted Execution Environments (TEEs) in minutes, backed by a full Zero Trust architecture.
[👉 Read the full Whitepaper here]
The Problem: TEEs Are Powerful, But Painful

Trusted Execution Environments (TEEs) are a breakthrough for confidential computing: they isolate code and data at the hardware level, enabling privacy-preserving AI and secure multi-party collaboration.
But until now, TEEs have been hard to use and hard to trust:
- Vendor lock-in: Keys tied to a single hardware vendor make workloads fragile and non-portable.
- Incomplete verifiability: Remote Attestation proves identity, but not the full chain of trust.
- Centralized control: Domain and lifecycle management often depend on external authorities.
These gaps have kept TEEs from reaching their full potential in Web3, AI, and enterprise applications.
The dstack Breakthrough
dstack transforms raw TEE hardware into a true Zero Trust platform with three key innovations:
1. Portable Confidential Containers
Workloads can migrate across TEE vendors and hardware without breaking security.

- Blockchain-controlled key management (dstack-KMS)
- Forward and backward secrecy with key rotation
- Multi-party computation (MPC) for distributed trust
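To make the rotation idea concrete, here is a minimal sketch of epoch-based key derivation. This is a hypothetical illustration, not the dstack-KMS API: the function name, the `app_id` parameter, and the stand-in master secret are all assumptions. The point is that each epoch key is an independent PRF output under a root secret that never leaves the KMS, so leaking one epoch's key reveals neither earlier nor later keys.

```python
import hashlib
import hmac

def epoch_key(master_secret: bytes, app_id: str, epoch: int) -> bytes:
    """Derive an independent per-epoch data key from a KMS-held master secret.

    Each epoch key is an HMAC-SHA256 output keyed by the master secret,
    which never leaves the KMS. Compromise of one epoch key therefore
    reveals neither past nor future epoch keys, giving both forward and
    backward secrecy on rotation.
    """
    info = f"{app_id}:epoch:{epoch}".encode()
    return hmac.new(master_secret, info, hashlib.sha256).digest()

master = b"\x00" * 32  # stand-in for the MPC-held root secret
k1 = epoch_key(master, "my-app", 1)
k2 = epoch_key(master, "my-app", 2)
assert k1 != k2 and len(k1) == 32  # keys are distinct 256-bit values
```

In a distributed-trust deployment, the `master` value would never exist in one place: the HMAC computation would itself be performed under MPC across key holders.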
2. Decentralized Code Management
Application lifecycles are no longer dictated by admins — they are governed by smart contracts.

- Immutable audit trails of all updates
- Multi-sig and DAO-compatible governance models
- Secrets only released to authorized, verified code
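The "secrets only to verified code" rule can be sketched as a simple gate: compare the code measurement reported in an attestation against an allowlist maintained by the governance contract. This is an assumption-laden illustration (the function names and the in-memory allowlist are hypothetical; in dstack the allowlist lives on-chain), but it captures the release logic.

```python
import hashlib
from typing import Optional

# Hypothetical stand-in for the governance contract's approved code
# measurements (in dstack, maintained on-chain with an immutable audit trail).
approved_measurements = {
    hashlib.sha256(b"app-image-v1").hexdigest(),
    hashlib.sha256(b"app-image-v2").hexdigest(),
}

def release_secret(attested_measurement: str, secret: bytes) -> Optional[bytes]:
    """Release the secret only if the attested code hash is governance-approved."""
    if attested_measurement in approved_measurements:
        return secret
    return None  # unverified or unauthorized code gets nothing

good = hashlib.sha256(b"app-image-v2").hexdigest()
bad = hashlib.sha256(b"tampered-image").hexdigest()
assert release_secret(good, b"db-password") == b"db-password"
assert release_secret(bad, b"db-password") is None
```

Because updates to the allowlist go through the contract, a multi-sig or DAO vote is what changes which code can receive secrets, not an administrator's decision.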
3. Verifiable Domain Management
Every dstack app runs on standard HTTPS, but with TLS certificates generated inside TEEs.
- No central certificate authority dependency
- End-to-end Zero Trust TLS (ZT-TLS) protocol
- Seamless user experience in any browser
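The core of the ZT-TLS binding can be sketched in a few lines. This is a simplified illustration, not the actual protocol: the placeholder keypair and dict-shaped quote stand in for real asymmetric key generation and a hardware attestation quote. The idea it shows is that the TEE hashes its freshly generated TLS public key into the attestation report, so a verifier can check that the certificate it sees on the wire belongs to the attested enclave.

```python
import hashlib
import secrets

def tee_generate_binding():
    """Inside the TEE: generate a TLS key and bind its hash into attestation.

    The 32 random bytes stand in for a real keypair, and the dict stands in
    for a hardware-signed quote; only the binding logic is illustrated.
    """
    tls_public_key = secrets.token_bytes(32)          # placeholder keypair
    report_data = hashlib.sha256(tls_public_key).digest()
    attestation_quote = {"report_data": report_data}  # placeholder quote
    return tls_public_key, attestation_quote

def verify_binding(cert_public_key: bytes, quote: dict) -> bool:
    """Client side: trust the certificate only if its key hash matches the
    value the TEE committed to in its attestation quote."""
    return hashlib.sha256(cert_public_key).digest() == quote["report_data"]

pk, quote = tee_generate_binding()
assert verify_binding(pk, quote)                 # genuine TEE key accepted
assert not verify_binding(b"attacker-key", quote)  # substituted key rejected
```

Because the private key never leaves the TEE and the public key is bound to the quote, no external certificate authority has to be trusted with the binding.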
Together, these innovations turn confidential computing from a fragile niche into developer-friendly, censorship-resistant, verifiable infrastructure.
Why Now? Private AI at Scale
The rise of AI has made data privacy, model protection, and verifiable execution urgent. Regulators, enterprises, and researchers all recognize the stakes: without trustless infrastructure, “private AI” is just a promise.
dstack provides the missing bridge — from Web2’s container-native workflows to Web3’s cryptographic guarantees. Docker simplicity meets Zero Trust security.
Inside the Whitepaper
The new dstack Whitepaper details:

- The architecture of dstack-OS, dstack-KMS, and dstack-Gateway
- A practical verification walkthrough for real-world applications
- Security analysis of TEE threats and mitigation strategies
- Broader implications for AI, enterprise computing, and beyond
We’ve also included detailed diagrams showing how containers move through the dstack runtime, how governance contracts enforce updates, and how ZT-TLS binds applications to their domains.
[👉 Read the full Whitepaper here]
What’s Next
We believe dstack is the missing layer for confidential AI, decentralized infrastructure, and Zero Trust cloud-native computing.
Here’s how you can get involved:
- Researchers: Review and stress-test our design.
- Developers: Explore our SDK (coming soon) to deploy Docker containers in TEEs instantly.
- Enterprises & VCs: Partner with us to bring dstack into production environments.
Call to Action
The future of trustless computing isn’t theoretical anymore. It’s dockerized, auditable, and on-chain.
[👉 Read the full Whitepaper here]