Clean Up Large S3 Buckets
I found a neat Python tool called s3wipe that brings significant speed improvements when deleting extremely large S3 buckets. It achieves this by using multiple threads and batch deletes. This really helped me out recently when deleting buckets containing several million objects and versions.

Example Usage

Empty a bucket of all objects, and delete the bucket when done:

```sh
BUCKET_NAME=project-files-public

docker run -it --rm slmingol/s3wipe \
  --id ${AWS_ACCESS_KEY_ID} \
  --key ${AWS_SECRET_ACCESS_KEY} \
  --path "s3://${BUCKET_NAME}" \
  --delbucket
```

Remove all objects and versions with a certain prefix, but retain the bucket.
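A minimal sketch of that invocation, assuming s3wipe accepts a key prefix as part of its --path argument; the prefix used here is a placeholder:

```sh
BUCKET_NAME=project-files-public
PREFIX=uploads/2019   # hypothetical prefix, adjust to your bucket layout

# Same flags as before, but without --delbucket so the bucket itself is kept.
docker run -it --rm slmingol/s3wipe \
  --id ${AWS_ACCESS_KEY_ID} \
  --key ${AWS_SECRET_ACCESS_KEY} \
  --path "s3://${BUCKET_NAME}/${PREFIX}"
```

Because the prefix is part of the path, only objects and versions underneath it are deleted; everything else in the bucket is left alone.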
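To see where the speedup comes from, it helps to look at the batch-delete primitive s3wipe builds on: S3's DeleteObjects call accepts up to 1,000 keys per request, so a single request does the work of a thousand individual deletes, and s3wipe issues those requests from multiple threads in parallel. The same primitive is reachable from the AWS CLI; the keys below are placeholders:

```sh
# Delete several objects in a single request instead of one request per key.
aws s3api delete-objects \
  --bucket "${BUCKET_NAME}" \
  --delete '{"Objects":[{"Key":"uploads/a.txt"},{"Key":"uploads/b.txt"}],"Quiet":true}'
```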