
s3-multiputter.py

Description:
  Reads a huge file (or device) and uploads it to Amazon S3.
  Multiple workers are launched which read & send in parallel.
  Workers are allocated one chunk of the file at a time.
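
The chunk-claiming scheme works roughly as sketched below: a shared
counter hands each worker thread the next unclaimed chunk; the worker
seeks to that offset, reads one chunk, and uploads it as one part of an
S3 multipart upload. This is a minimal illustration assuming the boto
2.x API -- the function and variable names are made up, not the
script's actual internals.

    # Sketch of the parallel chunked upload (boto 2.x assumed).
    import io
    import os
    import threading

    import boto
    from boto.s3.multipart import MultiPartUpload

    def parallel_upload(bucket_name, path, threads, chunk_mb):
        # S3 requires every part except the last to be >= 5 MB.
        chunk_bytes = chunk_mb * 1024 * 1024

        # Size the source by seeking to the end: os.stat() reports
        # 0 bytes for block devices, but seeking still works.
        with open(path, 'rb') as f:
            f.seek(0, os.SEEK_END)
            nchunks = (f.tell() + chunk_bytes - 1) // chunk_bytes

        bucket = boto.connect_s3().get_bucket(bucket_name)
        mp = bucket.initiate_multipart_upload(os.path.basename(path))

        lock = threading.Lock()
        state = {'next': 0}

        def worker():
            # boto connections are not thread-safe, so each worker opens
            # its own and rebuilds a handle to the shared upload by id.
            my_mp = MultiPartUpload(
                boto.connect_s3().get_bucket(bucket_name))
            my_mp.id = mp.id
            my_mp.key_name = mp.key_name
            with open(path, 'rb') as f:
                while True:
                    with lock:  # claim the next unclaimed chunk
                        chunk = state['next']
                        state['next'] += 1
                    if chunk >= nchunks:
                        return
                    f.seek(chunk * chunk_bytes)
                    data = f.read(chunk_bytes)
                    # S3 part numbers are 1-based.
                    my_mp.upload_part_from_file(io.BytesIO(data), chunk + 1)

        pool = [threading.Thread(target=worker) for _ in range(threads)]
        for t in pool:
            t.start()
        for t in pool:
            t.join()
        mp.complete_upload()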

Usage:
  s3-multiputter.py <BUCKET> <FILE> <THREADS> <CHUNKSIZE>

    BUCKET:    The S3 bucket name to upload to
    FILE:      The source file to upload
    THREADS:   Number of parallel uploader threads
    CHUNKSIZE: Size (MB) of each chunk
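
For example (bucket and device names are illustrative), to upload the
block device /dev/xvdf with 8 uploader threads and 64 MB chunks:

    s3-multiputter.py my-bucket /dev/xvdf 8 64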

This script was designed for uploading very large block device
images to S3. The problem it aims to solve is moving an EBS
volume from one AWS region to another using S3 as the
transport medium.

Launch a hefty EC2 instance like a c1.xlarge, attach your EBS
volume(s) to it, and use s3-multiputter.py to upload the block
device to S3 -- it can read directly from /dev/sdxN devices.
You can upload directly from an EC2 instance to the S3 service
in another region, or you can upload from an EC2 instance to
S3 in the local region and then use S3's server-side copy to
move your image file from one S3 region to another.
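
The copy step can run entirely inside S3. A hedged sketch with boto
2.x (bucket and key names are illustrative):

    import boto

    dest = boto.connect_s3().get_bucket('my-bucket-eu-west-1')
    # Server-side copy: the bytes move within S3, not through this host.
    dest.copy_key('volume.img', 'my-bucket-us-east-1', 'volume.img')
    # Note: a single copy call is limited to 5 GB; larger objects need
    # a multipart copy (MultiPartUpload.copy_part_from_key in boto).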

Prerequisites:
  The Boto library must be installed & configured with AWS creds:
  http://code.google.com/p/boto/wiki/BotoConfig
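
Per the BotoConfig page, credentials typically live in ~/.boto
(placeholder values shown):

    [Credentials]
    aws_access_key_id = <your access key>
    aws_secret_access_key = <your secret key>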