The AWS CLI is a command-line tool developed by Amazon that is widely used by developers and IT system administrators. You can use it with any S3-compatible object storage service, such as Cubbit, to manage your buckets and objects.


Before using the AWS CLI, you will need to obtain your access key and secret key from the Cubbit Web Console (or from https://console.[your-tenant] if you have a custom tenant). To do this, please follow the instructions on how to create an account and generate these keys.


To install the AWS CLI, follow the official AWS installation instructions.


To configure the AWS CLI, type:

aws configure
Using a different profile

If you want to set a different profile (e.g. cubbit), type:

aws configure --profile cubbit

This command will generate a series of prompts, which should be filled out with the previously generated keys:

AWS Access Key ID [None]: <your Cubbit access key>
AWS Secret Access Key [None]: <your Cubbit secret key>
Default region name [None]: eu-west-1
Default output format [None]: <leave empty and press Enter>
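Under the hood, aws configure stores the access keys in ~/.aws/credentials (and the region/output settings in ~/.aws/config), both plain INI files shared with the AWS SDKs. A minimal Python sketch of the credentials file format, using a temporary directory and placeholder key values:

```python
import configparser
import os
import tempfile

# aws configure writes credentials to ~/.aws/credentials in INI format.
# Here we write an equivalent file into a temporary directory to inspect it.
tmp = tempfile.mkdtemp()
cred_path = os.path.join(tmp, "credentials")

with open(cred_path, "w") as f:
    f.write(
        "[default]\n"
        "aws_access_key_id = EXAMPLEACCESSKEY\n"
        "aws_secret_access_key = EXAMPLESECRETKEY\n"
    )

# The file is plain INI, so it can be read back with configparser.
cfg = configparser.ConfigParser()
cfg.read(cred_path)
print(cfg["default"]["aws_access_key_id"])  # EXAMPLEACCESSKEY
```

A named profile such as cubbit simply becomes an extra `[cubbit]` section in the same file.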

You can also tune individual settings, for example the maximum number of concurrent requests:

aws configure set default.s3.max_concurrent_requests 20
Using a different profile

If you want to edit a different profile (e.g. cubbit), type:

aws configure set s3.max_concurrent_requests 20 --profile cubbit
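Setting an s3 value for a named profile records it under a `[profile cubbit]` section of ~/.aws/config, using the CLI's nested-setting syntax. A sketch of the resulting file format (placeholder path; the file is plain INI, where indented lines continue the previous value):

```python
import configparser
import os
import tempfile

# Sketch of what ~/.aws/config looks like after setting
# s3.max_concurrent_requests for the "cubbit" profile.
tmp = tempfile.mkdtemp()
config_path = os.path.join(tmp, "config")

with open(config_path, "w") as f:
    f.write(
        "[profile cubbit]\n"
        "s3 =\n"
        "    max_concurrent_requests = 20\n"
    )

# In INI terms, the indented line is a continuation of the "s3" value.
cfg = configparser.ConfigParser()
cfg.read(config_path)
print(cfg["profile cubbit"]["s3"])
```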

How to

The following are some common AWS CLI commands for managing your Cubbit storage objects and buckets:


The commands below use Cubbit's default endpoint, https://s3.cubbit.eu. If you have a custom tenant, remember to replace it with https://s3.[your-tenant]

Bucket commands

Create a bucket

aws s3 mb s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

List all buckets

aws s3 ls --endpoint-url https://s3.cubbit.eu

List the content of a bucket

aws s3 ls s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

Delete a bucket

aws s3 rb s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

The AWS CLI rb command deletes an empty S3 bucket: a bucket must contain no objects or object versions before it can be removed. The --force option deletes the non-versioned objects in the bucket before removing it; versioned objects are not deleted by this parameter.

Object commands

Upload a single file

aws s3 cp sergio.jpg s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

Upload a folder

aws s3 sync test_folder s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

Multipart upload

S3-compatible object storage services support splitting large files into separate chunks of data and uploading them in parallel once the file size exceeds a certain threshold.

By default, the multipart threshold for the AWS CLI is 8 MB. This means that any file larger than 8 MB is automatically split into chunks that are uploaded in parallel. To use this feature, simply upload a file larger than 8 MB and the AWS CLI takes care of the rest automatically.

For more info, you can refer to the section about Multipart Upload.
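The arithmetic behind this behaviour can be sketched as follows. The defaults below match the CLI's multipart_threshold and multipart_chunksize settings; note that for very large files the CLI may grow the part size, since S3 allows at most 10,000 parts per upload.

```python
import math

MB = 1024 * 1024
multipart_threshold = 8 * MB  # AWS CLI default threshold
multipart_chunksize = 8 * MB  # AWS CLI default part size

def part_count(file_size: int) -> int:
    """Number of parts the CLI would upload for a file of this size."""
    if file_size < multipart_threshold:
        return 1  # single PutObject, no multipart upload
    return math.ceil(file_size / multipart_chunksize)

print(part_count(5 * MB))    # 1  -> plain upload
print(part_count(100 * MB))  # 13 -> 12 full 8 MB parts plus a remainder
```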

Download a single file

aws s3 cp s3://my-cubbit-bucket/sergio.jpg ~/Downloads/myfile.jpg --endpoint-url https://s3.cubbit.eu

Download a folder

aws s3 cp --recursive s3://my-cubbit-bucket/test_folder ~/Downloads/new_folder --endpoint-url https://s3.cubbit.eu

Delete a single file

aws s3 rm s3://my-cubbit-bucket/myfile.jpg --endpoint-url https://s3.cubbit.eu

Delete all files in a bucket

aws s3 rm --recursive s3://my-cubbit-bucket/ --endpoint-url https://s3.cubbit.eu

If you want to use a different profile (e.g. cubbit), add the --profile option to the commands, for instance:

aws s3 ls --profile cubbit --endpoint-url https://s3.cubbit.eu

Testing single S3 APIs

Up to this point, only the aws s3 CLI has been discussed. However, it's important to note that Amazon also provides an alternative CLI named aws s3api.

s3api is a low-level utility that exposes each individual S3 API, while the aws s3 commands above wrap whole operations.

For instance, let's pick the upload operation. The command

aws s3 cp sergio.jpg s3://my-cubbit-bucket --endpoint-url https://s3.cubbit.eu

under the hood calls a few S3 APIs, such as HeadObject and PutObject. On the other hand, the command

aws s3api put-object --bucket my-cubbit-bucket --key sergio.jpg --body ~/sergio.jpg --endpoint-url https://s3.cubbit.eu

performs a single, specific PutObject call.

Tools for PowerShell S3 Module

The S3 module of AWS Tools for PowerShell allows developers and administrators to manage Amazon Simple Storage Service (S3) or other S3-compatible providers from the PowerShell scripting environment. To manage the S3 service, the relevant AWS.Tools.S3 module needs to be installed.

Module installation

You can download the tools from AWS, and the official AWS documentation covers installation on Windows systems. A separate guide explains how to install and configure AWS.Tools.S3 on Linux and macOS.

Module configuration


You can skip this part if you're using the default path for AWS CLI credentials.

-ProfileLocation path/credentials

Used to specify the name and location of the credentials file in .ini format (shared with the AWS CLI and other AWS SDKs). If this optional parameter is omitted, the cmdlet first looks in the encrypted credentials file used by the AWS SDK for .NET and the AWS Toolkit for Visual Studio. If the profile is not found there, it falls back to the .ini-format credentials file at the default location, <user's home directory>\.aws\credentials. If the parameter is specified, the cmdlet searches only the .ini-format credentials file at that location. Since the current directory may change in a shell or during script execution, it is advisable to specify a full path rather than a relative one.
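Because the credentials file is plain INI, the effect of pointing -ProfileLocation at a custom path can be sketched in Python (the path, profile name, and key values below are placeholders):

```python
import configparser
import os
import tempfile

# A credentials file at a non-default location, as -ProfileLocation would reference.
custom_dir = tempfile.mkdtemp()
cred_path = os.path.join(custom_dir, "my-credentials")

with open(cred_path, "w") as f:
    f.write(
        "[cubbit]\n"
        "aws_access_key_id = EXAMPLEACCESSKEY\n"
        "aws_secret_access_key = EXAMPLESECRETKEY\n"
    )

# When an explicit location is given, only that file is searched;
# here we read the "cubbit" profile from it directly.
cfg = configparser.ConfigParser()
cfg.read(cred_path)
print(cfg["cubbit"]["aws_access_key_id"])  # EXAMPLEACCESSKEY
```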

Use of the module

First, start PowerShell. Then run the following command to import the AWS.Tools.S3 module:

Import-Module AWS.Tools.S3

The commands below use Cubbit's default endpoint, https://s3.cubbit.eu; remember to replace it with https://s3.[your-tenant] if you have a custom tenant.

The following command removes all objects and all object versions from the bucket 'bucket-name' and then deletes the bucket. By default the command prompts for confirmation; the -Force option skips the prompt. Since -ProfileLocation is not specified, the default credentials are read from <user's home directory>\.aws\credentials.

Remove-S3Bucket -BucketName bucket-name -DeleteBucketContent -Force -EndpointUrl https://s3.cubbit.eu

PowerShell execution policies

If you encounter an error caused by Windows execution policies that prevent importing the S3 Tools module for PowerShell, change the execution policy with the following commands:

  • Get-ExecutionPolicy retrieves the current execution policies.
  • Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass temporarily bypasses the restrictions. With the Process scope, it affects only the current PowerShell session, so it must be run again in each new PowerShell instance.

Documentation on PowerShell execution policies can be found on the Microsoft website.

Other examples of commands you can use are listed below:

  • List of available commands
Get-Command -Module AWS.Tools.S3
  • Create a new S3 bucket
New-S3Bucket -BucketName my-bucket -EndpointUrl https://s3.cubbit.eu
  • Upload an object to an S3 bucket
Write-S3Object -BucketName my-bucket -File my-file.txt -EndpointUrl https://s3.cubbit.eu
  • Download an object from an S3 bucket
Read-S3Object -BucketName "my-bucket" -Key "remotepath\myobject" -File "localpath\myfile" -EndpointUrl https://s3.cubbit.eu
  • Remove an object from an S3 bucket
Remove-S3Object -BucketName my-bucket -Key my-object -EndpointUrl https://s3.cubbit.eu