Polls an SQS queue for S3 Put/Post/Copy events on a publicly available bucket and replicates the files listed in those events to Azure Blob Storage via an Azure-side pull.
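The core idea is a simple loop: receive S3 event notifications from the queue, extract the bucket and key from each record, and ask Azure to copy the object straight from its public S3 URL. Below is a minimal sketch of that approach using boto3 and azure-storage-blob; the names and values are illustrative, and this is not the actual queue2blob.py implementation.

```python
# Minimal sketch of the replication loop (illustrative only, not the actual queue2blob.py code).
# Assumes boto3 and azure-storage-blob (v12) are installed and the source bucket is publicly readable.
import json
from urllib.parse import unquote_plus

import boto3
from azure.storage.blob import BlobServiceClient

QUEUE_NAME = "my-queue"           # hypothetical values
REGION = "us-east-1"
STORAGE_ACCOUNT = "mystorageacct"
STORAGE_KEY = "..."
CONTAINER = "mycontainer"

sqs = boto3.resource("sqs", region_name=REGION)
queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)
blob_service = BlobServiceClient(
    account_url=f"https://{STORAGE_ACCOUNT}.blob.core.windows.net",
    credential=STORAGE_KEY,
)

while True:
    # Long-poll the queue for S3 event notifications
    for message in queue.receive_messages(WaitTimeSeconds=20, MaxNumberOfMessages=10):
        body = json.loads(message.body)
        for record in body.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = unquote_plus(record["s3"]["object"]["key"])
            # Have Azure pull the object directly from its public S3 URL
            source_url = f"https://{bucket}.s3.amazonaws.com/{key}"
            blob_client = blob_service.get_blob_client(container=CONTAINER, blob=key)
            blob_client.start_copy_from_url(source_url)
        message.delete()
```

For the bucket's event notifications to reach the queue, the SQS queue needs a policy that allows S3 to send messages, for example: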
{
    "Version": "2012-10-17",
    "Id": "addmessage",
    "Statement": [
        {
            "Sid": "sqsAllow",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "SQS:SendMessage",
            "Resource": "arn:aws:sqs:us-east-1:ACCOUNT_NUMBER:QUEUE_NAME",
            "Condition": {
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:s3:*:*:NAME_OF_BUCKET"
                }
            }
        }
    ]
}
where ACCOUNT_NUMBER is your AWS account ID, QUEUE_NAME is the name of your queue, and NAME_OF_BUCKET is the bucket that is sending the S3 events.
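One way to attach this policy to an existing queue is with the AWS CLI's set-queue-attributes command; the queue URL and file name below are placeholders:

```bash
# policy-attributes.json contains {"Policy": "<the policy above as a JSON-encoded string>"}
aws sqs set-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/ACCOUNT_NUMBER/QUEUE_NAME \
  --attributes file://policy-attributes.json
```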
Run the script with a config file:

python queue2blob.py --config-file location/of/config.json

The config file has the following format:
{
    "QUEUE": "QUEUE",
    "REGION": "AWS REGION IF DEPLOYED IN AWS",
    "S3REGION": "S3 REGION",
    "STORAGE_ACCOUNT": "AZURE STORAGE ACCOUNT",
    "STORAGE_KEY": "AZURE STORAGE KEY (OPTIONAL IF READ FROM ENVIRONMENT)",
    "CONTAINER": "AZURE CONTAINER",
    "PROFILE": "AWS CREDENTIAL PROFILE"
}
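For example, a filled-in config.json might look like this (all values are hypothetical):

```json
{
    "QUEUE": "my-queue",
    "REGION": "us-east-1",
    "S3REGION": "us-east-1",
    "STORAGE_ACCOUNT": "mystorageacct",
    "STORAGE_KEY": "base64-storage-key==",
    "CONTAINER": "mycontainer",
    "PROFILE": "default"
}
```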
You can also specify parameters on the command line instead of using a config file:
--queue QUEUE_NAME          Queue name
--region REGION             AWS region that the queue is in
--s3region S3_REGION        The region prefix for S3 downloads
--profile PROFILE           The name of an AWS CLI profile to use
--storage STORAGE_ACCOUNT   The name of the storage account to use
--key STORAGE_KEY           The key for the storage account
--container CONTAINER       The container for the blob
--debug                     Set the debug flag
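An equivalent invocation using command-line parameters (all values hypothetical) might look like:

```bash
python queue2blob.py \
  --queue my-queue \
  --region us-east-1 \
  --s3region us-east-1 \
  --profile default \
  --storage mystorageacct \
  --key "base64-storage-key==" \
  --container mycontainer
```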
To run in Docker, mount your config file into the container and pass the storage key via the STORAGE_KEY environment variable (the key can be read from the environment instead of the config file):

docker run -d --privileged=true \
    -v /my/local/folder/config.json:/usr/src/app/config.json \
    -e STORAGE_KEY=AZURE_STORAGE_KEY \
    --name s3queue2blob \
    signiant/s3queue2blob
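To follow the container's output once it is running:

```bash
docker logs -f s3queue2blob
```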