Python with AWS - Create S3 Bucket, Upload and Download Files using Boto3

In this article, we cover how to install Python on Ubuntu 24.04 LTS, install Boto3, install the AWS CLI, configure an IAM user via the CLI, and then create an S3 bucket, upload a file, and download a file using the Python Boto3 module.


Step#1: Update Packages on Ubuntu 24.04 LTS

Update the package index on Ubuntu 24.04 LTS using the command below:

sudo apt update

Step#2: Install Python 3 on Ubuntu 24.04 LTS

Python 3 comes preinstalled on Ubuntu 24.04, so installing it is straightforward. If it is missing, use the command below to install Python 3 on Ubuntu 24.04 LTS:

sudo apt install python3 

To check the Python version, use the command below:

python3 --version

Step#3: Install Boto3 on Ubuntu 24.04 LTS

Boto3 is the AWS SDK for Python and is commonly used for interacting with Amazon Web Services (AWS). On Ubuntu 24.04 LTS the system Python is externally managed, so the simplest route is to install Boto3 from the Ubuntu repositories with apt (installing python3-pip as well keeps pip available for other packages):

sudo apt install python3-pip
sudo apt install python3-boto3

Verify the installation:

python3 -c "import boto3; print(boto3.__version__)"

This command checks if Boto3 is installed and prints its version.

Step#4: Create an IAM User and Configure It Using the AWS CLI

Before using Boto3, you need to set up authentication credentials for your AWS account using either the IAM Console or the AWS CLI. You can either choose an existing user or create a new one.

For instructions about how to create a user using the IAM Console, see Creating IAM users. Once the user has been created, see Managing access keys to learn how to create and retrieve the keys used to authenticate the user.

If you have the AWS CLI installed, you can use the aws configure command to set up your credentials file:

aws configure

It will prompt for the AWS Access Key ID, AWS Secret Access Key, default region name, and default output format:

AWS Access Key ID [None]: *********7G
AWS Secret Access Key [None]: ***********************px
Default region name [None]: ap-south-1
Default output format [None]: json

You have now connected to your AWS account using the AWS CLI.

Alternatively, you can create the credentials file yourself. By default, its location is ~/.aws/credentials. At a minimum, the credentials file should specify the access key and secret access key. In this example, the key and secret key for the account are specified in the default profile:

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

You may also want to add a default region to the AWS configuration file, which is located by default at ~/.aws/config:

[default]
region=ap-south-1

Alternatively, you can pass a region_name when creating clients and resources.

You have now configured credentials for the default profile as well as a default region to use when creating connections. See Configuration for in-depth configuration sources and options.
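As a quick sanity check, both files follow the standard INI format, which Python's standard-library configparser can read. Below is a minimal sketch (the key values are placeholders, not real credentials) showing how the two files are laid out and parsed:

```python
import configparser

# The INI-style layout boto3 expects in ~/.aws/credentials
# (values here are placeholders, not real keys).
credentials_text = """\
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
"""

# ...and the layout of ~/.aws/config
config_text = """\
[default]
region=ap-south-1
"""

creds = configparser.ConfigParser()
creds.read_string(credentials_text)
conf = configparser.ConfigParser()
conf.read_string(config_text)

print(creds["default"]["aws_access_key_id"])  # YOUR_ACCESS_KEY
print(conf["default"]["region"])              # ap-south-1
```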

Step#5: Using Boto3 with AWS S3

To use Boto3, you must first import it and indicate which service or services you’re going to use:

We will create an S3 bucket, upload a file to it, and download a file from it.

Create the Python script below:

sudo nano aws-s3.py
import boto3

# Connect to AWS S3 through Boto3
s3 = boto3.client("s3")

# Specify a valid AWS region code
region = 'ap-south-1'  # Example: Asia Pacific (Mumbai)
  1. Importing Boto3 Library:
    • import boto3: This line imports the Boto3 library, which provides an interface to AWS services in Python.
  2. Connecting to AWS S3:
    • s3 = boto3.client("s3"): Here, we create an S3 client object named s3 using boto3.client, allowing us to interact with S3 services.
  3. Specifying AWS Region:
    • region = 'ap-south-1': This line specifies the AWS region code. In this case, it is set to 'ap-south-1', which corresponds to the Asia Pacific (Mumbai) region.
# Create S3 Bucket with region constraint
bucketResponse = s3.create_bucket(
    Bucket="mydevopshintbucket",
    CreateBucketConfiguration={'LocationConstraint': region}
)

# Bucket name, reused by the upload and download calls below
BUCKET_NAME = "mydevopshintbucket"
  1. Creating an S3 Bucket:
    • bucketResponse = s3.create_bucket(...): This line creates an S3 bucket named "mydevopshintbucket" with a region constraint specified by the CreateBucketConfiguration parameter. The LocationConstraint parameter ensures that the bucket is created in the specified AWS region (region).
  2. Setting the Bucket Name:
    • BUCKET_NAME = "mydevopshintbucket": This line stores the bucket name in the variable BUCKET_NAME, which the upload and download calls below reference.
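One caveat worth knowing: for the us-east-1 region, create_bucket must be called without the CreateBucketConfiguration argument, or S3 rejects the request. A small helper (a hypothetical illustration, not part of boto3) can build the keyword arguments accordingly:

```python
# Hypothetical helper (not part of boto3) that builds create_bucket
# keyword arguments. S3 rejects a LocationConstraint of "us-east-1",
# so the configuration block is omitted for that region.
def create_bucket_kwargs(bucket, region):
    kwargs = {"Bucket": bucket}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

print(create_bucket_kwargs("mydevopshintbucket", "ap-south-1"))
# {'Bucket': 'mydevopshintbucket', 'CreateBucketConfiguration': {'LocationConstraint': 'ap-south-1'}}
```

With the helper, the call becomes s3.create_bucket(**create_bucket_kwargs(BUCKET_NAME, region)).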
# Upload Files to the S3 Bucket
filesPath = "/opt/"
fileToUpload = "test.jpg"
fileToS3 = filesPath + fileToUpload
s3.upload_file(fileToS3, BUCKET_NAME, fileToUpload)

print(f"S3 bucket '{BUCKET_NAME}' created successfully in region '{region}'")

Upload Files to S3:

  • filesPath = "/opt/": Path where the file is located.
  • fileToUpload = "test.jpg": Name of the file to upload.
  • fileToS3 = filesPath + fileToUpload: Full path of the file.
  • s3.upload_file(fileToS3, BUCKET_NAME, fileToUpload): Uploads fileToS3 to the S3 bucket named BUCKET_NAME under the key fileToUpload.
  • print(f"S3 bucket '{BUCKET_NAME}' created successfully in region '{region}'"): Prints a success message confirming the creation of the S3 bucket in the specified region.
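Plain string concatenation works here because filesPath ends with "/", but os.path.join from the standard library handles the separator for you either way:

```python
import os

filesPath = "/opt/"
fileToUpload = "test.jpg"

# os.path.join inserts the "/" only when it is missing
fileToS3 = os.path.join(filesPath, fileToUpload)
print(fileToS3)  # /opt/test.jpg
```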
# Download File from S3 Bucket
downloadedFile = "downloaded_test.jpg"
s3.download_file(BUCKET_NAME, fileToUpload, downloadedFile)

print(f"File '{fileToUpload}' downloaded from S3 bucket '{BUCKET_NAME}' as '{downloadedFile}'")

Download File from S3:

  • downloadedFile = "downloaded_test.jpg": Name for the downloaded file.
  • s3.download_file(BUCKET_NAME, fileToUpload, downloadedFile): Downloads the file fileToUpload from the S3 bucket BUCKET_NAME and saves it as downloadedFile.
  • print(f"File '{fileToUpload}' downloaded from S3 bucket '{BUCKET_NAME}' as '{downloadedFile}'"): Prints a message confirming the successful download of the file from the specified S3 bucket.
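Since upload_file and download_file need a live AWS account to run, here is a minimal local sketch with a stand-in client (FakeS3Client is a hypothetical illustration, not part of boto3) that mimics the same call order, so the upload/download flow can be traced offline:

```python
import os
import shutil
import tempfile

# Hypothetical stand-in for boto3's S3 client (no real AWS calls):
# it copies files under a local directory that plays the role of the
# bucket, just to illustrate the upload/download call order.
class FakeS3Client:
    def __init__(self, root):
        self.root = root

    def upload_file(self, filename, bucket, key):
        bucket_dir = os.path.join(self.root, bucket)
        os.makedirs(bucket_dir, exist_ok=True)
        shutil.copyfile(filename, os.path.join(bucket_dir, key))

    def download_file(self, bucket, key, filename):
        shutil.copyfile(os.path.join(self.root, bucket, key), filename)

with tempfile.TemporaryDirectory() as tmp:
    # Create a stand-in for /opt/test.jpg
    src = os.path.join(tmp, "test.jpg")
    with open(src, "wb") as f:
        f.write(b"fake image bytes")

    s3 = FakeS3Client(tmp)
    s3.upload_file(src, "mydevopshintbucket", "test.jpg")

    dst = os.path.join(tmp, "downloaded_test.jpg")
    s3.download_file("mydevopshintbucket", "test.jpg", dst)

    with open(dst, "rb") as f:
        roundtrip_ok = f.read() == b"fake image bytes"

print(roundtrip_ok)  # True
```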

Complete Python File:

import boto3

# Connect to AWS S3 through Boto3
s3 = boto3.client("s3")

# Specify a valid AWS region code
region = 'ap-south-1'  # Example: Asia Pacific (Mumbai)

# Create S3 Bucket with region constraint
bucketResponse = s3.create_bucket(
    Bucket="mydevopshintbucket",
    CreateBucketConfiguration={'LocationConstraint': region}
)

# Bucket name, reused by the upload and download calls below
BUCKET_NAME = "mydevopshintbucket"

# Upload Files to the S3 Bucket
filesPath = "/opt/"
fileToUpload = "test.jpg"
fileToS3 = filesPath + fileToUpload
s3.upload_file(fileToS3, BUCKET_NAME, fileToUpload)

print(f"S3 bucket '{BUCKET_NAME}' created successfully in region '{region}'")

# Download File from S3 Bucket
downloadedFile = "downloaded_test.jpg"
s3.download_file(BUCKET_NAME, fileToUpload, downloadedFile)

print(f"File '{fileToUpload}' downloaded from S3 bucket '{BUCKET_NAME}' as '{downloadedFile}'")

Executing the Python Code:

python3 aws-s3.py

Output:

S3 bucket 'mydevopshintbucket' created successfully in region 'ap-south-1'
File 'test.jpg' downloaded from S3 bucket 'mydevopshintbucket' as 'downloaded_test.jpg'

Conclusion:

In conclusion, leveraging Python and Boto3 for AWS S3 operations streamlines file management, offering efficient upload and download capabilities for enhanced workflow automation and productivity. With these tools, developers can seamlessly integrate S3 bucket handling into their applications, optimizing cloud storage management.

Reference:

For reference, visit the official website.

For any queries, please contact us at Fosstechnix.com.

Related Articles:

Start and Stop AWS EC2 Instance using Python Boto3

Akash Bhujbal

Hey, I am Akash Bhujbal, an aspiring DevOps and Cloud enthusiast eager to embark on a journey into this field. With a strong passion for technology and a keen interest in DevOps and Cloud based solutions, I am driven to learn and contribute to this ever-evolving space.
