I think cloud services are a great tool and can be really helpful depending on your needs. One of the first I heard about was S3, which lets us store our data in the cloud. It doesn’t come with an application like Dropbox that syncs a folder for you automatically; instead, S3 works by means of an API (Application Programming Interface) that is available in several programming languages. Besides the API, there is a tool that lets us interact with S3 and other Amazon cloud services: the AWS Command Line Interface. To install it, open a terminal and type
$ pip install awscli
After installing it, you need to configure it with your Amazon credentials. For that, go to the AWS website and look for Sign In to the Console. Once you are there, click on your name at the top right and then on Security Credentials. Among the options in the left menu, click on Users. I think it is better to create a user for one specific task; that way you can avoid bad news on your credit card later, because you can grant this user only the set of permissions it actually needs. What I recommend is clicking on the user and then going to Permissions. There you can Attach Policy; the one we need here is AmazonS3FullAccess. Once you create the new user, make sure to store its credentials. You will need them for the following command:
$ aws configure
AWS Access Key ID [None]: accesskey
AWS Secret Access Key [None]: secretkey
Default region name [None]: us-east-1
Default output format [None]:
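By the way, `aws configure` does nothing mysterious: it just writes those values to two plain-text files in your home directory. With the placeholder values above, they would look roughly like this (`~/.aws/credentials`):

```ini
[default]
aws_access_key_id = accesskey
aws_secret_access_key = secretkey
```

and `~/.aws/config`:

```ini
[default]
region = us-east-1
```

If you ever need to switch accounts, you can edit these files directly or add extra named profiles alongside `[default]`.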
Now that you have configured the AWS command line tool with your credentials, you can get bucketing. The way to create a bucket is through the command s3 mb (translation: s3 make bucket):
$ aws s3 mb s3://mybucket
We can also specify a region if the default is not what you want at the moment:
$ aws s3 mb s3://mybucket --region us-west-1
Here, I will create a bucket named ataias.code in the region us-east-1:
$ aws s3 mb s3://ataias.code --region us-east-1
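If you want to double-check that the bucket really exists, the same tool can list what is on your account. A quick sketch, assuming the credentials and bucket name from above:

```shell
# List all buckets owned by the configured account
aws s3 ls

# List the contents of the new bucket (empty right after creation)
aws s3 ls s3://ataias.code
```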
Now it is time for the magic. I entered a directory with some files and executed the following command in my terminal:
$ aws s3 sync . s3://ataias.code
upload: ./PlantOPC.py to s3://ataias.code/PlantOPC.py
upload: ./controleSmithPredictor.py to s3://ataias.code/controleSmithPredictor.py
upload: ./Controller.py to s3://ataias.code/Controller.py
upload: ./PlantOrig.py to s3://ataias.code/PlantOrig.py
upload: ./GenWindowsData.py to s3://ataias.code/GenWindowsData.py
upload: ./Model.py to s3://ataias.code/Model.py
upload: ./DelayBlock.py to s3://ataias.code/DelayBlock.py
upload: ./Plant.py to s3://ataias.code/Plant.py
As you can see, quite a few files were uploaded. If you run the same command again without changing the files, nothing is uploaded. If you create an empty folder and want to sync the bucket into it, you can just do:
$ mkdir test
$ cd test
$ aws s3 sync s3://ataias.code .
download: s3://ataias.code/Controller.py to ./Controller.py
download: s3://ataias.code/Kalman.py to ./Kalman.py
download: s3://ataias.code/Model.py to ./Model.py
download: s3://ataias.code/Plant.py to ./Plant.py
download: s3://ataias.code/GenWindowsData.py to ./GenWindowsData.py
download: s3://ataias.code/.gitignore to ./.gitignore
download: s3://ataias.code/PlantOPC.py to ./PlantOPC.py
download: s3://ataias.code/DelayBlock.py to ./DelayBlock.py
download: s3://ataias.code/ReducedModel.py to ./ReducedModel.py
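The reason nothing is re-uploaded on a second sync is that the tool only transfers files that differ, comparing things like each file’s size and timestamp against what is already in the bucket. Just to illustrate the idea, here is a toy sketch of that decision rule in Python; the function, the file listings, and the exact comparison are made up for illustration (the real tool also handles deletions, large files, and more):

```python
def needs_upload(local, remote):
    """Decide which local files a sync-like tool would upload.

    A file is uploaded if it is missing remotely, or if its recorded
    size or modification time differs from the remote copy.
    `local` and `remote` map filename -> (size, mtime).
    """
    uploads = []
    for name, (size, mtime) in local.items():
        if name not in remote:
            uploads.append(name)            # new file, must upload
        elif remote[name] != (size, mtime):
            uploads.append(name)            # changed file, re-upload
    return uploads

local = {"Plant.py": (2048, 1700000000), "Model.py": (512, 1700000100)}
remote = {"Plant.py": (2048, 1700000000)}   # Model.py not in the bucket yet
print(needs_upload(local, remote))           # ['Model.py']
```

Running the same comparison a second time, with the remote listing now matching the local one, would return an empty list, which is exactly the "nothing is uploaded" behaviour we saw above.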
That’s all for today! Thanks for reading! If you have any doubts or thoughts, please share them in the comments. Get in touch!