Using s3cmd To Manage Files on Amazon S3

Recently I moved some podcasts onto Amazon Simple Storage Service, or S3, which I'd heard was great and easy to use. I'd used it through various wrappers, but never directly until now. It turns out, unsurprisingly, that S3 really is great and easy to use :) I used s3cmd from s3tools – a collection of Python scripts that made this really easy. Even better, I'm an Ubuntu user, so s3cmd is already packaged for me and I simply installed it with:

sudo aptitude install s3cmd

Once installed, I found s3cmd --help surprisingly helpful. To start, you need to create an access key on AWS (Amazon Web Services) using your Amazon user credentials, then supply it to s3cmd by running s3cmd --configure and following the prompts.
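The configure step writes its answers to a config file in your home directory (~/.s3cfg). A minimal excerpt looks something like this – the key values here are placeholders, not real credentials:

```ini
; Excerpt of ~/.s3cfg as written by s3cmd --configure
; (access_key and secret_key below are placeholders)
[default]
access_key = YOUR_ACCESS_KEY_ID
secret_key = YOUR_SECRET_ACCESS_KEY
use_https = True
```

If you ever need to switch accounts, you can re-run s3cmd --configure or edit this file directly.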

Working with Buckets

S3 storage works on “buckets”, which seem to be like root directories for virtual hosts (hold the walrus jokes, please). Bucket names must be unique across the whole of S3, so some organisation-specific prefixing may be needed here, but the command looks something like:

s3cmd mb s3://bucketname

The bucket name starts with s3:// to denote that it is accessed on S3.


To put files onto S3 there are two commands. For a single file, use s3cmd put, which takes a source and a target and copies the file accordingly. For multiple files, s3cmd has a really handy sync command, which accepts a directory as the source argument and a bucket or path as the target, and keeps the two in sync. I found this very helpful as I had 40+ podcasts to move!
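A sketch of both upload styles – the bucket name and local paths here are hypothetical examples, not names from the post:

```shell
# Hypothetical source directory and bucket name for illustration
SRC_DIR="podcasts"
BUCKET="s3://my-podcast-bucket"

if command -v s3cmd >/dev/null 2>&1; then
  # Single file: source first, then the s3:// target
  s3cmd put "$SRC_DIR/episode-001.mp3" "$BUCKET/episode-001.mp3"
  # Whole directory: sync compares local files against the bucket
  # and only transfers what has changed
  s3cmd sync "$SRC_DIR/" "$BUCKET/podcasts/"
else
  echo "s3cmd not installed; commands shown for illustration"
fi
```

The trailing slashes on the sync arguments matter: they tell s3cmd to copy the directory's contents rather than nesting the directory itself.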

To see what is in a bucket, use s3cmd ls with the name of the bucket. This lists all the files; use s3cmd info if you want to know more about an individual file, such as its size, modification date or permissions.
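For example (again, the bucket and file names here are hypothetical):

```shell
# Hypothetical bucket name for illustration
BUCKET="s3://my-podcast-bucket"

if command -v s3cmd >/dev/null 2>&1; then
  s3cmd ls "$BUCKET"                     # list the bucket's contents
  s3cmd info "$BUCKET/episode-001.mp3"   # details for a single file
else
  echo "s3cmd not installed; commands shown for illustration"
fi
```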


Since I’m only using S3 as a replacement for an uploads directory, all the files are publicly accessible. Amazon does provide a comprehensive ACL scheme but I didn’t use it so I won’t write about it this time. To make everything public, I simply did this:

s3cmd setacl --acl-public --recursive s3://bucketname
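As an alternative (a workflow assumption on my part, not something from the post above), you can also pass --acl-public at upload time, so each file is world-readable the moment it lands:

```shell
# Hypothetical bucket name for illustration
BUCKET="s3://my-podcast-bucket"

if command -v s3cmd >/dev/null 2>&1; then
  # --acl-public makes the uploaded object world-readable immediately,
  # avoiding a separate setacl pass afterwards
  s3cmd put --acl-public episode-002.mp3 "$BUCKET/episode-002.mp3"
else
  echo "s3cmd not installed; command shown for illustration"
fi
```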

File URLs

Once the files are there and public, they are web accessible: an object at s3://[bucket]/[filename] can be fetched from http://[bucket].s3.amazonaws.com/[filename]
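The mapping can be sketched in plain shell string handling – the bucket and file names are hypothetical, and this assumes the default virtual-hosted style of S3 URL:

```shell
# Hypothetical S3 address for illustration
S3_PATH="s3://my-podcast-bucket/episode-001.mp3"

# Strip the s3:// scheme, then split into bucket and key
REST="${S3_PATH#s3://}"
BUCKET="${REST%%/*}"
KEY="${REST#$BUCKET/}"

# Assemble the public, virtual-hosted style URL
URL="http://$BUCKET.s3.amazonaws.com/$KEY"
echo "$URL"   # prints http://my-podcast-bucket.s3.amazonaws.com/episode-001.mp3
```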

4 thoughts on “Using s3cmd To Manage Files on Amazon S3”

  1. I have a few applications in an AWS S3 bucket. I need to grant “everybody” permission for most of the files I upload.

    Is it possible to check the file permissions through s3cmd?
    If so, is it possible to check the file permissions and send out an email if any file is found without its permission set to “everybody”?

  2. Hi there,

    Love your blog. I am trying to sync a web folder which needs authentication with S3 bucket. Do you know whether that is possible?

