2. $s3cmd --configure
Set up the access key and secret key when prompted.
The test step may fail with "ERROR: Test failed: 403 (AccessDenied): Access Denied",
but the saved configuration can still work for buckets your keys have permission to access.
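The configure step writes its answers to ~/.s3cfg, so the keys can also be checked or edited there by hand. A minimal sketch of the relevant entries, with the key values as obvious placeholders:

```ini
# ~/.s3cfg (fragment) -- written by `s3cmd --configure`
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
use_https = True
```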
3. List objects under a prefix
$s3cmd ls s3://bucketname/folder/
4. Download a file
$s3cmd get s3://bucketname/folder/file.txt
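The `s3cmd ls` output has four columns (date, time, size, object URL), so the URLs to feed into `s3cmd get` can be pulled out with awk. A minimal sketch against a sample line (the bucket and file names are placeholders):

```shell
#!/bin/sh
# A sample line in the format `s3cmd ls` prints: date, time, size, URL.
line='2011-03-19 12:00      1024   s3://bucketname/folder/file.txt'

# Field 4 is the object URL -- the argument you would pass to `s3cmd get`.
echo "$line" | awk '{print $4}'
```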
5. Disk usage by buckets
s3cmd du [s3://BUCKET[/PREFIX]]
The size is returned in bytes; the shell script below converts it to KB, MB, and GB.
#!/bin/bash
# Sum the sizes of all objects in a bucket and report the total in B/KB/MB/GB.
if [ "${1}" ]
then
    NUM=0
    COUNT=0
    # `s3cmd ls` prints: date, time, size, URL -- the size is field 3.
    for N in `s3cmd ls "${1}" | awk '{print $3}' | grep '[0-9]'`
    do
        NUM=`expr ${NUM} + ${N}`
        ((COUNT++))
    done
    KB=`expr ${NUM} / 1024`
    MB=`expr ${NUM} / 1048576`
    GB=`expr ${NUM} / 1073741824`
    echo "${COUNT} files in bucket ${1}"
    echo "${NUM} B"
    echo "${KB} KB"
    echo "${MB} MB"
    echo "${GB} GB"
else
    echo "Usage: ${0} s3://bucket"
    exit 1
fi
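If only the unit conversion is needed (for example on the byte count that `s3cmd du` prints), the arithmetic can be isolated into a small function. A minimal sketch; the function name `to_units` is my own, and it uses the same integer division as the script above:

```shell
#!/bin/sh
# Convert a raw byte count to KB, MB, and GB (integer division,
# matching the expr-based arithmetic in the bucket-size script).
to_units() {
    bytes=$1
    echo "${bytes} B"
    echo "$(( bytes / 1024 )) KB"
    echo "$(( bytes / 1048576 )) MB"
    echo "$(( bytes / 1073741824 )) GB"
}

# Example: convert a 2 GiB total reported by `s3cmd du`.
to_units 2147483648
```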
Reference:
http://s3tools.org/s3cmd-howto
http://www.jonzobrist.com/2011/03/19/s3-du-sh-script-to-get-bucket-size-on-amazon-aws-s3/