Keep Your S3 Cloud Clean with Multi-Object Delete and Object Expiration
A long time ago, in a 32-bit world far, far away, we ran out of storage space quickly, and kept our closets clean as a result. Lots of the garbage really was garbage, too -- my small non-profit had absolutely no need for year-old log files, for example -- so nothing short of a legitimate hoarding neurosis could excuse these files' persistence.
Then storage became cheap -- then cloud storage became infinite, and why would you bother cleaning up if your storage space just kept growing? The cat hoarder isn't even crazy if the house is infinitely large: even a space filled with a million cats, if its volume is infinite, has zero density. Okay, maybe you pay a little extra, and maybe your performance suffers a little, but nothing crashes, and your performance is probably good enough, and is it really worth an extra half hour on a Friday afternoon?
Maybe. But two features Amazon S3 introduced in December 2011 can give you that half hour back (and possibly much more):
- Multi-object delete: lets you delete up to 1,000 objects in a single request, including specific object versions by version ID. Great for catching up on old garbage, clearing out superseded data, tossing out an outdated project, etc. Learn more from the official announcement or developer guide.
- Object expiration: lets you define rules (up to 100 per bucket) that schedule automatic removal of objects from an S3 bucket. In other words: it stops the garbage from building up in the first place. Read the announcement, or peruse the developer guide.
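To make the two features concrete, here is a minimal sketch of the request bodies behind them, built with only the Python standard library. The bucket prefix and key names are hypothetical, and in practice you would send these payloads through an SDK (boto, the AWS Java/.NET SDKs, etc.) rather than assembling the XML by hand -- but the shapes below match the documented `POST /?delete` and `PUT /?lifecycle` REST operations.

```python
# Sketch (assumed names): batching keys for multi-object delete and
# building the XML payloads S3 expects for delete and expiration.
from xml.etree import ElementTree as ET

MAX_KEYS_PER_REQUEST = 1000  # multi-object delete caps each request at 1,000 keys


def batch_keys(keys, size=MAX_KEYS_PER_REQUEST):
    """Split a list of (key, version_id) tuples into request-sized batches."""
    return [keys[i:i + size] for i in range(0, len(keys), size)]


def build_delete_payload(objects):
    """Build the XML body for one multi-object delete (POST /?delete).

    objects: iterable of (key, version_id) tuples; version_id may be None
    to delete the latest version of the key.
    """
    root = ET.Element("Delete")
    ET.SubElement(root, "Quiet").text = "true"  # quiet mode: report failures only
    for key, version_id in objects:
        obj = ET.SubElement(root, "Object")
        ET.SubElement(obj, "Key").text = key
        if version_id:
            ET.SubElement(obj, "VersionId").text = version_id
    return ET.tostring(root, encoding="unicode")


def build_expiration_rule(prefix, days):
    """Build a one-rule lifecycle configuration (PUT /?lifecycle) that
    expires objects under `prefix` after `days` days."""
    root = ET.Element("LifecycleConfiguration")
    rule = ET.SubElement(root, "Rule")
    ET.SubElement(rule, "Prefix").text = prefix
    ET.SubElement(rule, "Status").text = "Enabled"
    expiration = ET.SubElement(rule, "Expiration")
    ET.SubElement(expiration, "Days").text = str(days)
    return ET.tostring(root, encoding="unicode")


# 2,500 year-old log files take three delete requests (1000 + 1000 + 500)...
keys = [("logs/2010/app-%04d.log" % i, None) for i in range(2500)]
batches = batch_keys(keys)
payload = build_delete_payload(batches[0])

# ...or one expiration rule stops them from accumulating in the first place.
rule = build_expiration_rule("logs/", 365)
print(len(batches))
print(rule)
```

The point of the batching helper is the 1,000-object ceiling: catching up on a big backlog still means looping over batches, while a single expiration rule handles the same cleanup forever with no requests at all.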