As data footprints grow, businesses need cost-efficient storage for infrequently accessed data, high-performance file systems for collaborative work, and more aggressive data protection policies to meet strict recovery objectives. We’re introducing several significant enhancements to our storage portfolio to help you meet the challenges of data management, protection, and scale.
TL;DR
Starting October 20, 2025, our Network File Storage solution will be generally available, designed specifically for high-performance AI workloads. After release, you can try it in the DigitalOcean console.
Spaces cold storage for infrequently accessed data is available for public preview. Visit our documentation to learn more and create a support ticket to request access to preview Spaces cold storage.
Usage-based backups are available for public preview to help meet aggressive Recovery Point Objectives (RPOs). Check out our documentation to learn more. Request access to preview usage-based backups by submitting this form.
Data-intensive applications, particularly in AI and machine learning, require shared, high-performance file storage that is easy to provision and manage. Introducing our new Network File System (NFS) service, generally available in our ATL1 and NYC data centers starting October 20, 2025. This fully managed, high-performance solution is specifically designed to meet the demands of AI/ML startups and data-centric businesses by enabling concurrent shared dataset access for multi-node workloads.
It offers key functionality including share provisioning and support for NFSv3 and NFSv4 with POSIX compliance. It operates seamlessly within your Virtual Private Cloud, allowing a single share to be mounted across multiple GPU/CPU Droplets, and is optimized for high throughput and low latency, making it ideal for AI workloads.
The service also provides snapshots for point-in-time restores and offers allocation-based pricing with discounts for GPU-committed customers. Provisioning is simple, and the service is designed for the high-throughput, low-latency demands of model training and inference. Unlike some competitors that start at 1TB+ increments with high minimums and complex pricing, our solution offers a more cost-effective entry point with increments as small as 50 GiB.
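As a sketch of what mounting a share across Droplets might look like in practice, the following uses standard NFSv4 client tooling. The share endpoint `nfs.example.internal` and export path `/exports/datasets` are hypothetical placeholders, not actual service values; the real address comes from the share you provision.

```shell
# Install the NFS client (Ubuntu/Debian Droplet)
sudo apt-get update && sudo apt-get install -y nfs-common

# Mount the share at /mnt/datasets (endpoint and path are placeholders)
sudo mkdir -p /mnt/datasets
sudo mount -t nfs4 nfs.example.internal:/exports/datasets /mnt/datasets

# Optional /etc/fstab entry so the mount persists across reboots
# nfs.example.internal:/exports/datasets  /mnt/datasets  nfs4  defaults,_netdev  0  0
```

Repeating the same mount on each GPU/CPU Droplet in the VPC gives every node a shared view of the same dataset.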
Use cases:
AI: A data science team can use a single NFS share to manage and preprocess large datasets for model training across a cluster of GPU Droplets, shortening training times.
Media and content: A video production agency can use shared storage to allow multiple designers to work on the same project files without having to move data.
Benefits:
Simplified operations: Eliminate the overhead of self-managing a shared file system.
Performance optimized: Get high throughput and low latency tailored for AI/ML workloads.
Cost-effective scaling: Start small with 50 GiB increments and scale more affordably as your data grows.
Once released, you can learn more about Network File Storage on the product documentation page or try it directly in the DigitalOcean console.
The rapid growth of AI has led digital-native enterprises to store vast quantities of data, much of which is rarely accessed. DigitalOcean’s Spaces cold storage is available for public preview to store such infrequently accessed objects at a price of $0.007/GiB per month. This includes retrieval of all cold data stored in a bucket up to once per month at no additional cost, after which retrieval overages are charged at $0.01 per GiB per month. This new cold storage bucket type provides a low-cost, S3-compatible solution for petabyte-scale datasets where data is accessed infrequently, needs to be retained for at least 30 days, and must be retrieved instantly.
To help you get started, all fees for Spaces cold storage beyond the $5 per month Spaces subscription will be waived until October 31, 2025. Pricing and capabilities are subject to change during general availability.
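To make the pricing model above concrete, here is a small illustrative calculator, my own sketch rather than official billing code, that applies the published rates: $0.007/GiB per month for storage, one retrieval per month at no cost (interpreted here as an allowance equal to the amount stored, per the description above), and $0.01/GiB for retrieval overages.

```python
STORAGE_RATE = 0.007   # $/GiB-month for cold storage
OVERAGE_RATE = 0.01    # $/GiB for retrievals beyond the free allowance

def monthly_cold_storage_cost(stored_gib: float, retrieved_gib: float) -> float:
    """Estimate one month's cost: storage plus retrieval overages.

    One retrieval per month, up to the amount stored, is included;
    anything beyond that is billed at the overage rate.
    """
    storage_cost = stored_gib * STORAGE_RATE
    overage_gib = max(0.0, retrieved_gib - stored_gib)
    return storage_cost + overage_gib * OVERAGE_RATE

# 10 TiB stored, fully retrieved once: storage cost only
print(round(monthly_cold_storage_cost(10_240, 10_240), 2))  # 71.68
# Same 10 TiB stored, but 12 TiB retrieved: 2,048 GiB of overage
print(round(monthly_cold_storage_cost(10_240, 12_288), 2))  # 92.16
```

At roughly $71.68 per month for 10 TiB, the sketch shows how the flat per-GiB rate keeps large-archive costs easy to forecast.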
Use cases:
Backups and disaster recovery: Store secondary copies of data that are rarely accessed but must be available instantly.
Application logs and diagnostics: Keep data that must be occasionally retrieved for incident investigation, security events, or regulatory needs.
Archives: Store user-generated content, scientific data, and older project files.
AI/ML training and inference archives: Archive large, infrequently accessed datasets or model checkpoints that can be retrieved on-demand.
Benefits:
Cost-effective scaling: Store petabytes of data at a fraction of the cost of standard storage tiers.
Predictable pricing: Our simple pricing model includes one retrieval per month, up to your average storage usage, at no additional cost, with transparent per-GiB pricing for additional retrievals, so you can avoid the high, unpredictable fees charged by some other providers.
Instant retrieval: Access your data within seconds, even when it’s stored in a cold tier.
Visit our documentation to learn more and create a support ticket to request access to preview Spaces cold storage.
Our new usage-based backup service, available now for public preview, helps users with strict Recovery Point Objectives (RPOs) schedule backups every 4, 6, or 12 hours. Flexible retention policies for Droplets can be configured from 3 days to 6 months.
This feature is paired with a transparent, consumption-based billing model that charges only for the amount of restorable data stored, at a rate determined by backup frequency:
Weekly: $0.04/GiB-month
Daily: $0.03/GiB-month
12-hour: $0.02/GiB-month
6-hour: $0.015/GiB-month
4-hour: $0.01/GiB-month
This means you only pay for what you actually use with no hidden fees for snapshot operations. This provides the flexibility to create a recovery plan that is both technically sound and financially viable, especially for high-change environments in regulated industries or for development environments. Pricing and capabilities are subject to change during general availability.
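As an illustration only (again, a sketch rather than official billing code), the tiered rates above can be applied like this:

```python
# $/GiB-month by backup frequency, from the published preview pricing
BACKUP_RATES = {
    "weekly": 0.04,
    "daily": 0.03,
    "12h": 0.02,
    "6h": 0.015,
    "4h": 0.01,
}

def monthly_backup_cost(restorable_gib: float, frequency: str) -> float:
    """Estimate the monthly charge for restorable data at a given frequency."""
    return restorable_gib * BACKUP_RATES[frequency]

# 500 GiB of restorable data backed up every 4 hours
print(round(monthly_backup_cost(500, "4h"), 2))  # 5.0
```

Note how the rate falls as frequency rises, so tightening your RPO from weekly to every 4 hours does not multiply your bill in proportion.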
Use cases:
Compliance-driven organizations: Meet stringent compliance standards like HIPAA and SOC 2 with frequent backups that can be retained for longer durations, providing a granular audit trail for sensitive data.
Gaming and AI startups: Protect rapidly changing user data and AI models on GPU Droplets with high-frequency backups, allowing for quick rollbacks in case of an issue.
Development environments: Shorten Recovery Point Objectives (RPOs) in Continuous Integration (CI)/Continuous Deployment (CD) pipelines and other dev workflows by using 4-6 hour backups to protect code and data changes.
SaaS environments: Safeguard rapidly changing user data in customer support platforms and SaaS tools by implementing more frequent, reliable data protection.
Benefits:
Enhanced data integrity: Restore from a more recent point in time, reducing data loss in the event of an incident.
Transparent billing: A clear, predictable cost model based on actual stored data.
Compliance-ready: Configure granular RPO and retention policies to help meet internal and external compliance standards.
Check out our documentation to learn more. Request access to preview usage-based backups by submitting this form.
By building these new capabilities, we are providing a more robust, flexible, and cost-effective infrastructure to help you address the challenges of scaling your business and keeping costs under control.
Ready to get started?
Try these features by heading to the DigitalOcean console.
Learn more by visiting our product documentation and regional availability pages.
Join our Deploy London session: How to Grow Without the Growing Pains, to get more details on these new product updates. You can either attend in-person or virtually.
Get expert guidance for free to strengthen your cloud architecture, optimize costs, scale your infrastructure, and improve backup and disaster recovery.
Building Products @ DigitalOcean